Feb 17 16:42:11 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 16:42:11 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 16:42:11 crc restorecon[4684]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc 
restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc 
restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc 
restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc 
restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11
crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:11 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc 
restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 16:42:12 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 16:42:12 crc kubenswrapper[4694]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 16:42:12 crc kubenswrapper[4694]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 16:42:12 crc kubenswrapper[4694]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 16:42:12 crc kubenswrapper[4694]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 16:42:12 crc kubenswrapper[4694]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 16:42:12 crc kubenswrapper[4694]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.671433 4694 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677552 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677576 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677582 4694 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677587 4694 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677593 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677598 4694 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677604 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677613 4694 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677635 4694 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 16:42:12 crc 
kubenswrapper[4694]: W0217 16:42:12.677641 4694 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677647 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677652 4694 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677658 4694 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677663 4694 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677667 4694 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677672 4694 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677677 4694 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677682 4694 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677687 4694 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677692 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677697 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677711 4694 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677717 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677721 4694 feature_gate.go:330] 
unrecognized feature gate: EtcdBackendQuota Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677726 4694 feature_gate.go:330] unrecognized feature gate: Example Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677731 4694 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677736 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677741 4694 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677746 4694 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677751 4694 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677758 4694 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677765 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677770 4694 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677776 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677781 4694 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677787 4694 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677792 4694 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677797 4694 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677803 4694 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677808 4694 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677813 4694 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677819 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677824 4694 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677829 4694 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677834 4694 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677839 4694 
feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677846 4694 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677852 4694 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677857 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677862 4694 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677870 4694 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677876 4694 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677881 4694 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677887 4694 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677892 4694 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677899 4694 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677905 4694 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677910 4694 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677915 4694 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677920 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677925 4694 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677930 4694 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677935 4694 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677940 4694 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677944 4694 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677949 4694 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677954 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677960 4694 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677965 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677970 4694 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.677975 4694 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678065 4694 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678075 4694 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678085 4694 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678093 4694 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678102 4694 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678110 4694 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678119 4694 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678127 4694 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678135 4694 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678143 4694 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678151 4694 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678158 4694 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678164 4694 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678170 4694 flags.go:64] FLAG: --cgroup-root="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678175 4694 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678181 4694 flags.go:64] 
FLAG: --client-ca-file="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678187 4694 flags.go:64] FLAG: --cloud-config="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678192 4694 flags.go:64] FLAG: --cloud-provider="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678198 4694 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678207 4694 flags.go:64] FLAG: --cluster-domain="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678215 4694 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678221 4694 flags.go:64] FLAG: --config-dir="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678227 4694 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678233 4694 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678242 4694 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678247 4694 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678253 4694 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678259 4694 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678265 4694 flags.go:64] FLAG: --contention-profiling="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678270 4694 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678276 4694 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678282 4694 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678287 4694 flags.go:64] 
FLAG: --cpu-manager-policy-options="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678294 4694 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678300 4694 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678306 4694 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678311 4694 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678317 4694 flags.go:64] FLAG: --enable-server="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678323 4694 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678329 4694 flags.go:64] FLAG: --event-burst="100" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678335 4694 flags.go:64] FLAG: --event-qps="50" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678341 4694 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678347 4694 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678353 4694 flags.go:64] FLAG: --eviction-hard="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678360 4694 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678365 4694 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678371 4694 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678377 4694 flags.go:64] FLAG: --eviction-soft="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678382 4694 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678388 4694 
flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678393 4694 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678399 4694 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678405 4694 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678410 4694 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678416 4694 flags.go:64] FLAG: --feature-gates="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678422 4694 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678428 4694 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678434 4694 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678440 4694 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678446 4694 flags.go:64] FLAG: --healthz-port="10248" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678452 4694 flags.go:64] FLAG: --help="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678458 4694 flags.go:64] FLAG: --hostname-override="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678463 4694 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678469 4694 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678475 4694 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678480 4694 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678486 4694 
flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678492 4694 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678497 4694 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678503 4694 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678508 4694 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678515 4694 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678521 4694 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678527 4694 flags.go:64] FLAG: --kube-reserved="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678532 4694 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678538 4694 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678544 4694 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678549 4694 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678555 4694 flags.go:64] FLAG: --lock-file="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678560 4694 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678566 4694 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678571 4694 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678580 4694 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678585 4694 
flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678591 4694 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678596 4694 flags.go:64] FLAG: --logging-format="text" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678602 4694 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678612 4694 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678637 4694 flags.go:64] FLAG: --manifest-url="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678642 4694 flags.go:64] FLAG: --manifest-url-header="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678650 4694 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678656 4694 flags.go:64] FLAG: --max-open-files="1000000" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678663 4694 flags.go:64] FLAG: --max-pods="110" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678669 4694 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678675 4694 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678681 4694 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678687 4694 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678692 4694 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678699 4694 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678704 4694 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678717 4694 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678723 4694 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678728 4694 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678734 4694 flags.go:64] FLAG: --pod-cidr="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678739 4694 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678748 4694 flags.go:64] FLAG: --pod-manifest-path="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678753 4694 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678759 4694 flags.go:64] FLAG: --pods-per-core="0" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678765 4694 flags.go:64] FLAG: --port="10250" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678770 4694 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678776 4694 flags.go:64] FLAG: --provider-id="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678781 4694 flags.go:64] FLAG: --qos-reserved="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678787 4694 flags.go:64] FLAG: --read-only-port="10255" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678793 4694 flags.go:64] FLAG: --register-node="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678799 4694 flags.go:64] FLAG: --register-schedulable="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678804 4694 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678813 4694 flags.go:64] FLAG: --registry-burst="10" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678819 4694 flags.go:64] FLAG: --registry-qps="5" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678825 4694 flags.go:64] FLAG: --reserved-cpus="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678830 4694 flags.go:64] FLAG: --reserved-memory="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678837 4694 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678843 4694 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678848 4694 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678854 4694 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678860 4694 flags.go:64] FLAG: --runonce="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678865 4694 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678872 4694 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678877 4694 flags.go:64] FLAG: --seccomp-default="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678883 4694 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678889 4694 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678895 4694 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678901 4694 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678907 
4694 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678913 4694 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678918 4694 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678924 4694 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678930 4694 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678935 4694 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678941 4694 flags.go:64] FLAG: --system-cgroups="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678947 4694 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678955 4694 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678961 4694 flags.go:64] FLAG: --tls-cert-file="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678966 4694 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678973 4694 flags.go:64] FLAG: --tls-min-version="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678979 4694 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678984 4694 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678990 4694 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.678995 4694 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.679001 4694 flags.go:64] FLAG: --v="2" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.679008 4694 
flags.go:64] FLAG: --version="false" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.679015 4694 flags.go:64] FLAG: --vmodule="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.679022 4694 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.679028 4694 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679178 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679186 4694 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679191 4694 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679199 4694 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679206 4694 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679211 4694 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679217 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679223 4694 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679228 4694 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679233 4694 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679244 4694 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679249 4694 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679255 4694 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679261 4694 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679266 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679272 4694 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679277 4694 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679282 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679286 4694 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679293 4694 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679338 4694 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679345 4694 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679352 4694 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679368 4694 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679375 4694 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679382 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679388 4694 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679394 4694 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679401 4694 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679406 4694 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679412 4694 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679418 4694 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679424 4694 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679430 4694 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679436 4694 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679442 4694 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679448 4694 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679454 4694 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679461 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679466 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679472 4694 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679478 4694 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679487 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679493 4694 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679573 4694 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679580 4694 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679593 4694 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679599 4694 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679605 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679636 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679642 4694 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679648 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679655 4694 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679661 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679666 4694 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679674 4694 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679678 4694 feature_gate.go:330] unrecognized feature gate: Example Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679683 4694 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679688 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679693 4694 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679698 4694 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 16:42:12 crc 
kubenswrapper[4694]: W0217 16:42:12.679702 4694 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679707 4694 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679712 4694 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679718 4694 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679722 4694 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679727 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679732 4694 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679739 4694 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679745 4694 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.679750 4694 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.681966 4694 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.693703 4694 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.694014 4694 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694323 4694 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694344 4694 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694352 4694 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694359 4694 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694367 4694 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694375 4694 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694382 4694 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694389 4694 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694396 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694403 4694 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694411 4694 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694421 4694 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694429 4694 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694437 4694 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694444 4694 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694451 4694 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694458 4694 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694465 4694 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694472 4694 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694478 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694484 4694 feature_gate.go:330] 
unrecognized feature gate: NetworkSegmentation Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694491 4694 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694498 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694504 4694 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694510 4694 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694516 4694 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694523 4694 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694546 4694 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694553 4694 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694559 4694 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694565 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694572 4694 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694578 4694 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694584 4694 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694590 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 
16:42:12.694596 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694607 4694 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694645 4694 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694652 4694 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694658 4694 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694668 4694 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694677 4694 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694685 4694 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694691 4694 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694698 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694705 4694 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694713 4694 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694719 4694 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694726 4694 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694732 4694 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694739 4694 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694745 4694 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694751 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694758 4694 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694764 4694 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694770 4694 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694776 4694 feature_gate.go:330] unrecognized feature gate: Example Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694782 4694 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694789 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694795 4694 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694804 4694 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694811 4694 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694818 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694824 4694 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694830 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694837 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694845 4694 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694852 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694858 4694 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694865 4694 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.694871 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.694880 4694 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 16:42:12 crc 
kubenswrapper[4694]: W0217 16:42:12.695041 4694 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695053 4694 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695060 4694 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695069 4694 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695078 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695085 4694 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695092 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695098 4694 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695105 4694 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695112 4694 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695118 4694 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695124 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695130 4694 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695138 4694 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695145 4694 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695151 4694 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695157 4694 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695164 4694 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695170 4694 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695176 4694 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695182 4694 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695188 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695195 4694 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695201 4694 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695207 4694 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695213 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695219 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695225 4694 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695231 4694 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695238 4694 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695244 4694 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695249 4694 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695256 4694 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695262 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695268 4694 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695274 4694 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695280 4694 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695288 4694 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695297 4694 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695303 4694 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695312 4694 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695319 4694 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695326 4694 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695333 4694 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695339 4694 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695346 4694 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695352 4694 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695358 4694 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695364 4694 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695370 4694 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695376 4694 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695382 4694 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695388 4694 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695395 4694 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695401 4694 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695407 4694 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695413 4694 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695420 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695426 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695432 4694 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695438 4694 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695446 4694 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695454 4694 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695460 4694 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695467 4694 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695473 4694 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695481 4694 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695487 4694 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695494 4694 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695501 4694 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.695508 4694 feature_gate.go:330] unrecognized feature gate: Example Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.695518 4694 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.695701 4694 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 
16:42:12.702986 4694 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.703110 4694 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.707001 4694 server.go:997] "Starting client certificate rotation" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.707035 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.707167 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-18 16:08:17.886089108 +0000 UTC Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.707234 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.732580 4694 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.735110 4694 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.736155 4694 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.750076 4694 log.go:25] "Validated CRI v1 runtime API" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.789017 
4694 log.go:25] "Validated CRI v1 image API" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.790811 4694 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.797763 4694 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-16-36-55-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.798025 4694 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.819887 4694 manager.go:217] Machine: {Timestamp:2026-02-17 16:42:12.816560237 +0000 UTC m=+0.573635601 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2d3fb0f7-c717-4d67-9d61-62b30d044694 BootID:d3d94249-43cc-4da5-9743-6861e47e40f5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} 
{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e4:f6:8d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e4:f6:8d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c2:7b:1f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:75:4f:62 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:37:7f:5f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c1:05:44 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5a:2f:86:11:05:f5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:26:4d:6c:77:a3:f6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 
Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.821376 4694 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.821994 4694 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.825074 4694 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.825361 4694 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.825403 4694 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.825739 4694 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.825756 4694 container_manager_linux.go:303] "Creating device plugin manager" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.826428 4694 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.826479 4694 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.826752 4694 state_mem.go:36] "Initialized new in-memory state store" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.826881 4694 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.830730 4694 kubelet.go:418] "Attempting to sync node with API server" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.830748 4694 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.830772 4694 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.830782 4694 kubelet.go:324] "Adding apiserver pod source" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.830792 4694 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.836219 4694 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.836446 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.836448 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 
38.102.83.75:6443: connect: connection refused Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.836504 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.836525 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.837241 4694 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.839097 4694 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840631 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840666 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840678 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840686 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840700 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840709 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840719 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840733 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840743 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840752 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840763 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.840772 4694 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.841740 4694 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.842250 4694 server.go:1280] "Started kubelet" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.843054 4694 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.843576 4694 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 17 16:42:12 crc systemd[1]: Started Kubernetes Kubelet. Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.844088 4694 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.844961 4694 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846052 4694 server.go:460] "Adding debug handlers to kubelet server" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846147 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846196 4694 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846364 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:05:34.473356148 +0000 UTC Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846901 4694 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846929 4694 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.846949 4694 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.846894 4694 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.847641 4694 factory.go:55] Registering systemd factory Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.847662 4694 factory.go:221] Registration of the systemd container factory successfully Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.847766 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.847811 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.847858 4694 factory.go:153] Registering CRI-O factory Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.847868 4694 factory.go:221] Registration of the crio container factory successfully Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.848043 4694 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.848063 4694 factory.go:103] Registering Raw factory Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.848077 4694 manager.go:1196] Started watching for new 
ooms in manager Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.854473 4694 manager.go:319] Starting recovery of all containers Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.855548 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.858251 4694 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18951649602d8b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 16:42:12.842179436 +0000 UTC m=+0.599254770,LastTimestamp:2026-02-17 16:42:12.842179436 +0000 UTC m=+0.599254770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866588 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866676 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 16:42:12 crc 
kubenswrapper[4694]: I0217 16:42:12.866700 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866719 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866735 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866752 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866770 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866786 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866810 4694 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866855 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866871 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866889 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866904 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866922 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866938 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866953 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866973 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.866989 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867006 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867024 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867041 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867058 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867077 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867095 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867113 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867130 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867153 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867175 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867194 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867257 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867276 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867293 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867312 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867372 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867390 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867407 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.867425 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869185 4694 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869225 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869246 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869264 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869280 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869295 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869311 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869330 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869347 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869363 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869379 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869398 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869415 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869432 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869449 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869466 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869490 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869509 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869528 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869546 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869562 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869580 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869598 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869640 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869658 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869677 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869695 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869714 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869731 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869748 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869764 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869781 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869799 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869816 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869832 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869849 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869881 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869899 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 
16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869914 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869931 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869948 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869964 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.869980 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870000 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 
16:42:12.870016 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870034 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870051 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870067 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870084 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870102 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870118 4694 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870134 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870151 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870168 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870185 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870203 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870219 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870235 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870251 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870268 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870283 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870348 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870369 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870388 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870406 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870423 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870440 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870486 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870511 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870536 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870555 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870572 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870590 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870606 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870648 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870673 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870693 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870713 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870730 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870746 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870762 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870778 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870795 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870813 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870831 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870852 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870869 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870885 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870904 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870920 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870937 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870956 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870973 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" 
seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.870992 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871010 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871027 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871046 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871065 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871084 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 
16:42:12.871103 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871122 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871138 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871154 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871170 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871188 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871204 4694 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871220 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871240 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871257 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871276 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871292 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871308 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871324 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871340 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871355 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871372 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871393 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871405 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871418 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871431 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871443 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871455 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871467 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871482 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 
16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871494 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871505 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871518 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871530 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871545 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871558 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871570 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871582 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871594 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871681 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871699 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871713 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871729 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871745 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871764 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871791 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871809 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871827 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871844 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871859 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871872 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871884 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871895 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871907 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871920 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871931 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871943 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871955 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871966 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871979 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.871990 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872003 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872016 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872029 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872043 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872056 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872070 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872083 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872098 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872114 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872130 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872146 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872163 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872179 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872195 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872212 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872228 4694 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872245 4694 reconstruct.go:97] "Volume reconstruction finished"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.872256 4694 reconciler.go:26] "Reconciler: start to sync state"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.875760 4694 manager.go:324] Recovery completed
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.890510 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.891171 4694 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.893919 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.893970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.893986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.894086 4694 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.894116 4694 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.894141 4694 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.894187 4694 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 17 16:42:12 crc kubenswrapper[4694]: W0217 16:42:12.894999 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.895059 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.896169 4694 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.896192 4694 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.896213 4694 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.918335 4694 policy_none.go:49] "None policy: Start"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.919097 4694 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.919139 4694 state_mem.go:35] "Initializing new in-memory state store"
Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.947696 4694 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990105 4694 manager.go:334] "Starting Device Plugin manager"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990150 4694 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990169 4694 server.go:79] "Starting device plugin registration server"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990550 4694 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990565 4694 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990768 4694 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990928 4694 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.990945 4694 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.994305 4694 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.994379 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995134 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995162 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995257 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995663 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995725 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995856 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995873 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995881 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.995954 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996146 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996201 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996661 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996681 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996802 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996747 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996839 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996856 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: E0217 16:42:12.996772 4694 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.996973 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997002 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997029 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997056 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997110 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997147 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997808 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997835 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997912 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997928 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998019 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998046 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.997932 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998302 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998655 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998684 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998696 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998762 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998793 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998802 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998953 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.998973 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.999713 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.999728 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:12 crc kubenswrapper[4694]: I0217 16:42:12.999737 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.056394 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.075319 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.075496 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.075666 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.075810 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.075975 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.076119 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.076298 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.076442 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.076952 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.077241 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.077402 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.077575 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.077744 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.077899 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.078045 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.091097 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.092825 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.092874 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.092921 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.092955 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.093869 4694 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179574 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179646 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179671 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179693 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179721 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179743 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179752 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179810 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179827 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179851 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179855 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") "
pod="openshift-etcd/etcd-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179900 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179874 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179867 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179932 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179933 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179954 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179960 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179999 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179900 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179974 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.179877 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 16:42:13 crc 
kubenswrapper[4694]: I0217 16:42:13.180061 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180040 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180094 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180113 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180132 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180155 4694 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180182 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.180253 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.294409 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.295882 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.295934 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.295951 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.295985 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.296580 4694 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.339371 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.353662 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.379188 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.407846 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.418116 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.458020 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Feb 17 16:42:13 crc kubenswrapper[4694]: W0217 16:42:13.470202 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f15723b004355246a6353cb4d91a0eb952780f24b42b54122987a3722b0bc4e6 WatchSource:0}: Error finding container f15723b004355246a6353cb4d91a0eb952780f24b42b54122987a3722b0bc4e6: Status 404 returned error can't find the container with id f15723b004355246a6353cb4d91a0eb952780f24b42b54122987a3722b0bc4e6 Feb 17 16:42:13 crc kubenswrapper[4694]: W0217 16:42:13.472163 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c461ff3db2434a358eb88178fd1e3028534ac7853d813e547ef040588180df6e WatchSource:0}: Error finding container c461ff3db2434a358eb88178fd1e3028534ac7853d813e547ef040588180df6e: Status 404 returned error can't find the container with id c461ff3db2434a358eb88178fd1e3028534ac7853d813e547ef040588180df6e Feb 17 16:42:13 crc kubenswrapper[4694]: W0217 16:42:13.472923 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-75c1a1a4aa2594e730287b619f1f8e6d91aa3d4b4bdc7b003bd9a272870474db WatchSource:0}: Error finding container 75c1a1a4aa2594e730287b619f1f8e6d91aa3d4b4bdc7b003bd9a272870474db: Status 404 returned error can't find the container with id 
75c1a1a4aa2594e730287b619f1f8e6d91aa3d4b4bdc7b003bd9a272870474db Feb 17 16:42:13 crc kubenswrapper[4694]: W0217 16:42:13.474773 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-06071617dc3961f8a891321f53c22c41e73addcaf080f2964cdffd8442464bf9 WatchSource:0}: Error finding container 06071617dc3961f8a891321f53c22c41e73addcaf080f2964cdffd8442464bf9: Status 404 returned error can't find the container with id 06071617dc3961f8a891321f53c22c41e73addcaf080f2964cdffd8442464bf9 Feb 17 16:42:13 crc kubenswrapper[4694]: W0217 16:42:13.475488 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-805fa57651f2ab76c3f33b9afc75af4841535fc149d75cc04c2bce15875ed549 WatchSource:0}: Error finding container 805fa57651f2ab76c3f33b9afc75af4841535fc149d75cc04c2bce15875ed549: Status 404 returned error can't find the container with id 805fa57651f2ab76c3f33b9afc75af4841535fc149d75cc04c2bce15875ed549 Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.605664 4694 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18951649602d8b6c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 16:42:12.842179436 +0000 UTC m=+0.599254770,LastTimestamp:2026-02-17 16:42:12.842179436 +0000 UTC m=+0.599254770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.697755 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.700896 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.700998 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.701028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.701076 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.701741 4694 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.846033 4694 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.846998 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 10:06:48.300237676 +0000 UTC Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.898413 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c461ff3db2434a358eb88178fd1e3028534ac7853d813e547ef040588180df6e"} Feb 17 16:42:13 crc kubenswrapper[4694]: 
I0217 16:42:13.899320 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"06071617dc3961f8a891321f53c22c41e73addcaf080f2964cdffd8442464bf9"} Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.900473 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"805fa57651f2ab76c3f33b9afc75af4841535fc149d75cc04c2bce15875ed549"} Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.902305 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"75c1a1a4aa2594e730287b619f1f8e6d91aa3d4b4bdc7b003bd9a272870474db"} Feb 17 16:42:13 crc kubenswrapper[4694]: I0217 16:42:13.905283 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f15723b004355246a6353cb4d91a0eb952780f24b42b54122987a3722b0bc4e6"} Feb 17 16:42:13 crc kubenswrapper[4694]: W0217 16:42:13.937378 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:13 crc kubenswrapper[4694]: E0217 16:42:13.937477 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:14 crc 
kubenswrapper[4694]: W0217 16:42:14.024938 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:14 crc kubenswrapper[4694]: E0217 16:42:14.025040 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:14 crc kubenswrapper[4694]: W0217 16:42:14.208926 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:14 crc kubenswrapper[4694]: E0217 16:42:14.209010 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:14 crc kubenswrapper[4694]: E0217 16:42:14.259306 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Feb 17 16:42:14 crc kubenswrapper[4694]: W0217 16:42:14.425113 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:14 crc kubenswrapper[4694]: E0217 16:42:14.425545 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.502786 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.504330 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.504374 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.504384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.504412 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 16:42:14 crc kubenswrapper[4694]: E0217 16:42:14.504921 4694 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.846569 4694 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 
16:42:14.847512 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 23:26:22.497391056 +0000 UTC Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.877858 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 16:42:14 crc kubenswrapper[4694]: E0217 16:42:14.879588 4694 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.911372 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9" exitCode=0 Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.911445 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.911708 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9"} Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.912654 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.912713 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.912731 4694 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.913928 4694 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8" exitCode=0 Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.914015 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8"} Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.914086 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.915405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.915445 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.915462 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.915528 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.916844 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.916868 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.916879 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.917423 4694 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d" exitCode=0
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.917552 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d"}
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.917566 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.918399 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.918427 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.918437 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.922323 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494"}
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.922360 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0"}
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.922374 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b"}
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.922389 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31"}
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.922458 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.923176 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.923209 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.923225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.924529 4694 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338" exitCode=0
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.924568 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338"}
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.924771 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.926367 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.926414 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:14 crc kubenswrapper[4694]: I0217 16:42:14.926437 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.846211 4694 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.848219 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 02:14:54.407460382 +0000 UTC
Feb 17 16:42:15 crc kubenswrapper[4694]: E0217 16:42:15.860036 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.931288 4694 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05" exitCode=0
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.931339 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.931470 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.932528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.932558 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.932567 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.935144 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.935193 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.935206 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.935217 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.937249 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.937282 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.938268 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.938461 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.938482 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.940239 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.940259 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.940270 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.940283 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29"}
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.940300 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.940981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.941018 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.941032 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.941345 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.941379 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:15 crc kubenswrapper[4694]: I0217 16:42:15.941390 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.105327 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.110468 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.110499 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.110508 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.110528 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 16:42:16 crc kubenswrapper[4694]: E0217 16:42:16.110866 4694 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc"
Feb 17 16:42:16 crc kubenswrapper[4694]: W0217 16:42:16.287523 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Feb 17 16:42:16 crc kubenswrapper[4694]: E0217 16:42:16.287677 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.849378 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 06:09:37.982925974 +0000 UTC
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.945983 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa"}
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.946032 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.946962 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.947007 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.947024 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.948665 4694 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868" exitCode=0
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.948749 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.948761 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.948755 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868"}
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.948825 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.948866 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952091 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952117 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952142 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952161 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952174 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952125 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952214 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:16 crc kubenswrapper[4694]: I0217 16:42:16.952181 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.674724 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.850521 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:03:26.393641092 +0000 UTC
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.957836 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d"}
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.957893 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63"}
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.957917 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae"}
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.957930 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.957935 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0"}
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.958111 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.958052 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.959338 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.959387 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.959405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.959582 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.959674 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:17 crc kubenswrapper[4694]: I0217 16:42:17.959696 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.199092 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.199296 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.201078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.201180 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.201202 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.851392 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:35:26.5272795 +0000 UTC
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.972045 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319"}
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.972117 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.972235 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.973784 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.973849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.973874 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.974112 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.974158 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:18 crc kubenswrapper[4694]: I0217 16:42:18.974180 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.050949 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.310970 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.312765 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.312824 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.312843 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.312880 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.852647 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 10:10:50.905832101 +0000 UTC
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.974452 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.975215 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.975256 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.975268 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.987961 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.988110 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.989130 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.989160 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:19 crc kubenswrapper[4694]: I0217 16:42:19.989169 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.222106 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.462692 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.462886 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.464210 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.464268 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.464327 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.585666 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.633359 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.638492 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.852991 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 03:26:45.368168759 +0000 UTC
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.977836 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.977836 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.979664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.979713 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.979713 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.979757 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.979723 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:20 crc kubenswrapper[4694]: I0217 16:42:20.979782 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:21 crc kubenswrapper[4694]: I0217 16:42:21.853595 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 03:39:01.743470701 +0000 UTC
Feb 17 16:42:21 crc kubenswrapper[4694]: I0217 16:42:21.979961 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:21 crc kubenswrapper[4694]: I0217 16:42:21.981132 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:21 crc kubenswrapper[4694]: I0217 16:42:21.981182 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:21 crc kubenswrapper[4694]: I0217 16:42:21.981196 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:22 crc kubenswrapper[4694]: I0217 16:42:22.662127 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 17 16:42:22 crc kubenswrapper[4694]: I0217 16:42:22.662290 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:22 crc kubenswrapper[4694]: I0217 16:42:22.663384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:22 crc kubenswrapper[4694]: I0217 16:42:22.663415 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:22 crc kubenswrapper[4694]: I0217 16:42:22.663425 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:22 crc kubenswrapper[4694]: I0217 16:42:22.853744 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 19:37:24.972713787 +0000 UTC
Feb 17 16:42:22 crc kubenswrapper[4694]: E0217 16:42:22.997148 4694 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 16:42:23 crc kubenswrapper[4694]: I0217 16:42:23.586315 4694 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 17 16:42:23 crc kubenswrapper[4694]: I0217 16:42:23.586450 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 16:42:23 crc kubenswrapper[4694]: I0217 16:42:23.854340 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 08:01:58.90520878 +0000 UTC
Feb 17 16:42:24 crc kubenswrapper[4694]: I0217 16:42:24.854935 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:38:43.064877222 +0000 UTC
Feb 17 16:42:25 crc kubenswrapper[4694]: I0217 16:42:25.855808 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:02:55.31974777 +0000 UTC
Feb 17 16:42:26 crc kubenswrapper[4694]: W0217 16:42:26.648308 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.648396 4694 trace.go:236] Trace[100337929]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 16:42:16.647) (total time: 10001ms):
Feb 17 16:42:26 crc kubenswrapper[4694]: Trace[100337929]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:42:26.648)
Feb 17 16:42:26 crc kubenswrapper[4694]: Trace[100337929]: [10.00118895s] [10.00118895s] END
Feb 17 16:42:26 crc kubenswrapper[4694]: E0217 16:42:26.648415 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 17 16:42:26 crc kubenswrapper[4694]: W0217 16:42:26.656940 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.657023 4694 trace.go:236] Trace[744263803]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 16:42:16.655) (total time: 10001ms):
Feb 17 16:42:26 crc kubenswrapper[4694]: Trace[744263803]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:42:26.656)
Feb 17 16:42:26 crc kubenswrapper[4694]: Trace[744263803]: [10.001218999s] [10.001218999s] END
Feb 17 16:42:26 crc kubenswrapper[4694]: E0217 16:42:26.657044 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 17 16:42:26 crc kubenswrapper[4694]: W0217 16:42:26.711726 4694 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.711826 4694 trace.go:236] Trace[1931879947]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 16:42:16.710) (total time: 10001ms):
Feb 17 16:42:26 crc kubenswrapper[4694]: Trace[1931879947]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (16:42:26.711)
Feb 17 16:42:26 crc kubenswrapper[4694]: Trace[1931879947]: [10.001026338s] [10.001026338s] END
Feb 17 16:42:26 crc kubenswrapper[4694]: E0217 16:42:26.711851 4694 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.847076 4694 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.856117 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 21:28:58.565481279 +0000 UTC
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.996215 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.998692 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa" exitCode=255
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.998747 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa"}
Feb 17 16:42:26 crc kubenswrapper[4694]: I0217 16:42:26.998952 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.000123 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.000194 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.000207 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.000787 4694 scope.go:117] "RemoveContainer" containerID="e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.119698 4694 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.119792 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.124434 4694 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.124508 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 17 16:42:27 crc kubenswrapper[4694]: I0217 16:42:27.856848 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 23:05:47.73994655 +0000 UTC
Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.003699 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.006353 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22"}
Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.006646 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.007965 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.008019 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.008039 4694 kubelet_node_status.go:724] "Recording event message for
node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:28 crc kubenswrapper[4694]: I0217 16:42:28.857247 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 19:20:08.622604582 +0000 UTC Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.857709 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 18:10:28.508358024 +0000 UTC Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.996360 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.996646 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.996767 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.998410 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.998448 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:29 crc kubenswrapper[4694]: I0217 16:42:29.998456 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.003068 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.010377 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.011308 
4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.011343 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.011353 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.470474 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.470737 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.472314 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.472365 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.472381 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:30 crc kubenswrapper[4694]: I0217 16:42:30.858596 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 18:16:22.325356625 +0000 UTC Feb 17 16:42:31 crc kubenswrapper[4694]: I0217 16:42:31.013138 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:31 crc kubenswrapper[4694]: I0217 16:42:31.014135 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:31 crc kubenswrapper[4694]: I0217 16:42:31.014192 4694 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:31 crc kubenswrapper[4694]: I0217 16:42:31.014204 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:31 crc kubenswrapper[4694]: I0217 16:42:31.300355 4694 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:31 crc kubenswrapper[4694]: I0217 16:42:31.859245 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 22:29:20.677880797 +0000 UTC Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.118370 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.124645 4694 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.124829 4694 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.124992 4694 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.155507 4694 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.324763 4694 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.468718 4694 csr.go:261] certificate signing 
request csr-vwsdm is approved, waiting to be issued Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.480765 4694 csr.go:257] certificate signing request csr-vwsdm is issued Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.586480 4694 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.687267 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.698256 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.707231 4694 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 16:42:32 crc kubenswrapper[4694]: W0217 16:42:32.707388 4694 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.707409 4694 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd/pods\": read tcp 38.102.83.75:48708->38.102.83.75:6443: use of closed network connection" pod="openshift-etcd/etcd-crc" Feb 17 16:42:32 crc kubenswrapper[4694]: W0217 16:42:32.707442 4694 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 16:42:32 crc kubenswrapper[4694]: W0217 16:42:32.707481 4694 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: 
k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.730262 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.735744 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.841536 4694 apiserver.go:52] "Watching apiserver" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.844214 4694 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.844826 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.845330 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.845433 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.845485 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.845811 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.845945 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.845994 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.846006 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.846030 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.846413 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.847720 4694 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.848255 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.848366 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.848416 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.848725 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.849498 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.850642 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.850855 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.850963 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.852892 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.859719 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 08:46:58.796477755 +0000 UTC Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.870039 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.884838 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.895777 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.908091 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.917750 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c
0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.928373 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929659 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929710 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929732 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929759 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929781 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929802 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929820 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929838 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929859 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929877 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929897 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929944 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.929975 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930001 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930039 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930060 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930096 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930094 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930121 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930165 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930213 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930231 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930253 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930292 4694 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930432 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930438 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930462 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930494 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930524 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930546 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930594 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930641 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930654 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930666 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930694 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930722 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930747 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") 
pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930775 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930801 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930826 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930852 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930877 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930906 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930931 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930958 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.930987 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931015 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931066 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 16:42:32 crc 
kubenswrapper[4694]: I0217 16:42:32.931119 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931148 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931173 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931201 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931227 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931259 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931313 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931340 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931343 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931374 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931407 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931470 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931500 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931530 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931558 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931586 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931634 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931664 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931696 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931726 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931754 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931785 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931817 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931845 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931842 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931876 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931963 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931968 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931988 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.931987 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932081 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932111 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932140 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932176 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932201 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932231 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932256 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932280 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932309 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932386 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 
16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932415 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932444 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932468 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932493 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932516 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932539 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932561 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932590 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932637 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932662 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932685 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 
16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932708 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932733 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932759 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932781 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932175 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932803 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932313 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932436 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932509 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932829 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932594 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932634 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932687 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932856 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932883 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932910 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932935 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932963 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932986 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933009 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933034 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933056 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933078 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933100 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933123 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933150 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933173 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933199 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933223 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933247 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935918 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.936331 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937475 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937528 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937563 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 
16:42:32.937596 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937639 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937676 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937717 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937748 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937779 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937802 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937827 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937853 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937898 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937921 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937947 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937975 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937996 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938018 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938041 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938061 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938085 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938108 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938131 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938151 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938173 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 16:42:32 
crc kubenswrapper[4694]: I0217 16:42:32.938260 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938281 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938306 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938329 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938490 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938665 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938697 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938719 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938739 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939272 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939302 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939329 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939352 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939378 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939402 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939457 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939504 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939525 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939549 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939656 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939679 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939701 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939725 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939749 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939769 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939823 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939854 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939891 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.940224 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.940653 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.932797 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933077 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.941854 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942212 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942345 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942383 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.933077 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935367 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935498 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935574 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935602 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935837 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.935908 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.936038 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.936142 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.936166 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.936767 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937169 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937482 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.937663 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938043 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938342 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938370 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938722 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938882 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938995 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.938774 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939110 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.939125 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.940545 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.940316 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.941364 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.941523 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942672 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942461 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942788 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942813 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942958 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.943044 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.943341 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.943410 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.943503 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.943734 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.943786 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.942355 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.944496 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.944681 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.944783 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.944821 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.945399 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.945422 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.945729 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.945910 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946118 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946325 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946342 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946420 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946423 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946087 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946473 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946508 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946519 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946586 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946643 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.947106 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.947366 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.949214 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951032 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951083 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951112 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951143 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951165 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951190 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951216 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951243 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951272 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951297 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951324 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951348 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951374 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951400 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951426 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951474 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951502 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951529 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951555 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951581 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951624 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951654 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951684 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951713 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951773 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951806 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951839 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951870 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951906 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951937 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951966 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.951996 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952027 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952049 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952076 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952100 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952126 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952151 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952234 4694 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952252 4694 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952265 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952282 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952297 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952310 4694 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952323 4694 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952335 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952346 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952358 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952374 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952388 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952400 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952413 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952427 4694 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952440 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952452 4694 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952466 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952481 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952493 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952507 4694 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952520 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952534 4694 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952547 4694 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952559 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952572 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952584 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952596 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952624 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952637 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952652 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952665 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952676 4694 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952687 4694 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952700 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952713 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952725 4694 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952738 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952757 4694 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952768 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952780 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952793 4694 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952805 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952817 4694 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952831 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952845 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952858 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952870 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952883 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952897 4694 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952910 4694 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952922 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952934 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952945 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952956 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952968 4694 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952980 4694 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.952994 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953008 4694 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953021 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953033 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953044 4694 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953057 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953068 4694 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953078 4694 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953090 4694 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953101 4694 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953115 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953128 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953141 4694 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953154 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953166 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953177 4694 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953188 4694 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953199 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953213 4694 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953225 4694 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953237 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953249 4694 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953260 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953272 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.953284 4694 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.954085 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:32 crc 
kubenswrapper[4694]: E0217 16:42:32.954427 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:33.454139303 +0000 UTC m=+21.211214677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.954785 4694 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.955392 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.955891 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.955971 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:33.455948099 +0000 UTC m=+21.213023513 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.956059 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.946250 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.956757 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.957928 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.958067 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.966639 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:42:33.466586858 +0000 UTC m=+21.223662182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.967440 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.967625 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.967783 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.967885 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.968060 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.968122 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.968981 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.969084 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.969299 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971173 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971225 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971074 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971395 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971456 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971599 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.971723 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.971740 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.971790 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.971843 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:33.471826041 +0000 UTC m=+21.228901365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.971849 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.972001 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.972133 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.975904 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.975925 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.975937 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:32 crc kubenswrapper[4694]: E0217 16:42:32.975984 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:33.475969195 +0000 UTC m=+21.233044519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.976843 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.977007 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.978083 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.978741 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.979029 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.981363 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.984902 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.987908 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.988979 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.990852 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.991199 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.991596 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.992099 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.992172 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.992301 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.994059 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.995849 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.996125 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.996160 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.996680 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.997067 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:32 crc kubenswrapper[4694]: I0217 16:42:32.997745 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:32.999990 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.000387 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.000484 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.001162 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.001623 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.001982 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.005936 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.006156 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.007064 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.007183 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.007469 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.008128 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.010139 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.010327 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.010421 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.010471 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.010634 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.010876 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.011012 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.011178 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.011863 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.012414 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.012440 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.012655 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.013792 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.015792 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.017421 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.017794 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.018740 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.019514 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.021379 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.021888 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.021947 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.022002 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.022444 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.022809 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.023342 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.026266 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.027191 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.029141 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.029562 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.030828 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.031066 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.032497 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.035283 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.035455 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.036687 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.036736 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.040914 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.041143 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.043164 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.043600 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.043899 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.044096 4694 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.044985 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.046228 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048090 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048329 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048390 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048641 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048636 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048758 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.048864 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.049154 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.049432 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.049463 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.049582 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.049798 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.050420 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.050921 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.050987 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.051037 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.051792 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.052051 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.052108 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.052320 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053731 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053758 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053805 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053908 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053940 4694 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053951 4694 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053960 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053969 4694 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053977 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053987 4694 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.053996 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054005 4694 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054014 4694 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054023 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054031 4694 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054039 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054047 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054055 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054064 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054073 4694 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054080 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054089 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054097 4694 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054106 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054116 4694 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054124 4694 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054133 4694 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 
16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054144 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054155 4694 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054166 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054176 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054188 4694 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054200 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054212 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054224 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054234 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054248 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054258 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054268 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054278 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054289 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054298 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" 
DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054307 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054316 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054324 4694 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054332 4694 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054340 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054348 4694 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054356 4694 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054365 4694 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054373 4694 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054382 4694 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054390 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054397 4694 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054405 4694 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054414 4694 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054422 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054430 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054437 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054445 4694 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054453 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054460 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054468 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054476 4694 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node 
\"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054484 4694 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054492 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054499 4694 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054507 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054515 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054523 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054531 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 
16:42:33.054539 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054549 4694 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054559 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054569 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054580 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054590 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054599 4694 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054673 4694 reconciler_common.go:293] "Volume detached 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054688 4694 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054698 4694 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054707 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054717 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054726 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054734 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054742 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054750 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054759 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054767 4694 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054776 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054784 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054792 4694 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054799 4694 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" 
DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054807 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054815 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054822 4694 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054830 4694 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054837 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054844 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054853 4694 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054861 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054869 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054877 4694 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054885 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054892 4694 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054900 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054908 4694 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054916 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054924 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054932 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054940 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054948 4694 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054956 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.054964 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.059508 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.065492 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.068833 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.069707 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-
operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.083251 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.096951 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.108229 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.118298 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.129721 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.150350 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.156098 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.156146 4694 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.161857 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.162753 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.176153 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 16:42:33 crc kubenswrapper[4694]: W0217 16:42:33.178041 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-61a6e44e17c06e637d62096a160ed7f92ea16cf763f91c4b7f2599e0b276aa99 WatchSource:0}: Error finding container 61a6e44e17c06e637d62096a160ed7f92ea16cf763f91c4b7f2599e0b276aa99: Status 404 returned error can't find the container with id 61a6e44e17c06e637d62096a160ed7f92ea16cf763f91c4b7f2599e0b276aa99 Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.181694 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 16:42:33 crc kubenswrapper[4694]: W0217 16:42:33.191703 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ddf14f0cd3d998e79e7685bb7be3c95148a8dcf157e0d826b5f0050973673a9c WatchSource:0}: Error finding container ddf14f0cd3d998e79e7685bb7be3c95148a8dcf157e0d826b5f0050973673a9c: Status 404 returned error can't find the container with id ddf14f0cd3d998e79e7685bb7be3c95148a8dcf157e0d826b5f0050973673a9c Feb 17 16:42:33 crc kubenswrapper[4694]: W0217 16:42:33.195639 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5532df28ee3bdcf4e64306392f9f994419845e2b46a792d6047f81a6486b3508 WatchSource:0}: Error finding container 5532df28ee3bdcf4e64306392f9f994419845e2b46a792d6047f81a6486b3508: Status 404 returned error can't find the container with id 5532df28ee3bdcf4e64306392f9f994419845e2b46a792d6047f81a6486b3508 Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.373371 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5rjgs"] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.373956 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.374813 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b5hgc"] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.374986 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pj7v4"] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.375119 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.375312 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.378164 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-d42qm"] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.378600 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.378838 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.378855 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.380546 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.380795 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381150 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381285 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381413 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381507 4694 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381600 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381705 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381812 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.381956 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.382084 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.382221 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.383643 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.423309 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.448081 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458513 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-os-release\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458556 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-hostroot\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458582 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-system-cni-dir\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458603 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k98nx\" (UniqueName: \"kubernetes.io/projected/3af5c84a-80ed-47ac-a79d-25b46c8e956e-kube-api-access-k98nx\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458637 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b-hosts-file\") pod \"node-resolver-5rjgs\" (UID: \"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\") " pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458718 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-cni-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458738 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-etc-kubernetes\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458757 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cni-binary-copy\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458776 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05e7e385-beb4-4e06-8718-fd68e90ba74e-proxy-tls\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458810 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqgs\" (UniqueName: \"kubernetes.io/projected/428dd081-b1bb-404f-856a-f33a1fa7c24a-kube-api-access-5qqgs\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458830 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-socket-dir-parent\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458855 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458874 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05e7e385-beb4-4e06-8718-fd68e90ba74e-mcd-auth-proxy-config\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458894 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/05e7e385-beb4-4e06-8718-fd68e90ba74e-rootfs\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458911 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-os-release\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.458927 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459059 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdrn\" (UniqueName: \"kubernetes.io/projected/05e7e385-beb4-4e06-8718-fd68e90ba74e-kube-api-access-2mdrn\") pod \"machine-config-daemon-b5hgc\" (UID: 
\"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459132 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-conf-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459179 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-daemon-config\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.459186 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459207 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btbkk\" (UniqueName: \"kubernetes.io/projected/3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b-kube-api-access-btbkk\") pod \"node-resolver-5rjgs\" (UID: \"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\") " pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.459240 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:34.459223591 +0000 UTC m=+22.216298915 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459259 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-netns\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459280 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-cni-bin\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459299 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-system-cni-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459317 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-k8s-cni-cncf-io\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459339 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459354 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-kubelet\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459370 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cnibin\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459388 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-cni-multus\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459409 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-multus-certs\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 
16:42:33.459439 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459460 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-cnibin\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.459477 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/428dd081-b1bb-404f-856a-f33a1fa7c24a-cni-binary-copy\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.459642 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.459713 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:34.459696533 +0000 UTC m=+22.216771857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.470279 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.482370 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 16:37:32 +0000 UTC, rotation deadline is 2026-11-12 05:30:22.958657251 +0000 UTC Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.482460 4694 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6420h47m49.476199631s for next certificate rotation Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.503420 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.525142 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.540419 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.553029 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560405 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560491 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btbkk\" (UniqueName: 
\"kubernetes.io/projected/3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b-kube-api-access-btbkk\") pod \"node-resolver-5rjgs\" (UID: \"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\") " pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560521 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-netns\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560544 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-cni-bin\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560564 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-system-cni-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560598 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-k8s-cni-cncf-io\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560637 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560667 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-kubelet\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560691 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cnibin\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560711 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-cni-multus\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560732 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-multus-certs\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560763 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-cnibin\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " 
pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560783 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/428dd081-b1bb-404f-856a-f33a1fa7c24a-cni-binary-copy\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560803 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-hostroot\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560825 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-system-cni-dir\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560848 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k98nx\" (UniqueName: \"kubernetes.io/projected/3af5c84a-80ed-47ac-a79d-25b46c8e956e-kube-api-access-k98nx\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560868 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-os-release\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 
16:42:33.560889 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b-hosts-file\") pod \"node-resolver-5rjgs\" (UID: \"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\") " pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560911 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-cni-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560931 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cni-binary-copy\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560952 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05e7e385-beb4-4e06-8718-fd68e90ba74e-proxy-tls\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560972 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-etc-kubernetes\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.560998 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561020 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qqgs\" (UniqueName: \"kubernetes.io/projected/428dd081-b1bb-404f-856a-f33a1fa7c24a-kube-api-access-5qqgs\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561052 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05e7e385-beb4-4e06-8718-fd68e90ba74e-mcd-auth-proxy-config\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561073 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-socket-dir-parent\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561101 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/05e7e385-beb4-4e06-8718-fd68e90ba74e-rootfs\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561122 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-os-release\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561143 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561164 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdrn\" (UniqueName: \"kubernetes.io/projected/05e7e385-beb4-4e06-8718-fd68e90ba74e-kube-api-access-2mdrn\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561187 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561209 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-conf-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561237 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-daemon-config\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.561968 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-daemon-config\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.562068 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:42:34.562051845 +0000 UTC m=+22.319127169 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.562343 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-netns\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.562383 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-cni-bin\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.562431 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-system-cni-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.562468 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-k8s-cni-cncf-io\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563051 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563109 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-kubelet\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563140 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cnibin\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563181 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-var-lib-cni-multus\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563218 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-host-run-multus-certs\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563258 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-cnibin\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563711 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/428dd081-b1bb-404f-856a-f33a1fa7c24a-cni-binary-copy\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563752 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-hostroot\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.563774 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-system-cni-dir\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.564161 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-os-release\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.564203 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b-hosts-file\") pod \"node-resolver-5rjgs\" (UID: \"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\") " 
pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.564288 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-cni-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.564695 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-socket-dir-parent\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.564865 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.564903 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.564918 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.564971 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-17 16:42:34.564952039 +0000 UTC m=+22.322027413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.565375 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05e7e385-beb4-4e06-8718-fd68e90ba74e-mcd-auth-proxy-config\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.564900 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.565429 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-etc-kubernetes\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.565447 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.565459 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.565472 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/428dd081-b1bb-404f-856a-f33a1fa7c24a-multus-conf-dir\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: E0217 16:42:33.565493 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:34.565483682 +0000 UTC m=+22.322559086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.565512 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/05e7e385-beb4-4e06-8718-fd68e90ba74e-rootfs\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.565571 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-os-release\") pod \"multus-additional-cni-plugins-d42qm\" (UID: 
\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.565719 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3af5c84a-80ed-47ac-a79d-25b46c8e956e-cni-binary-copy\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.569635 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3af5c84a-80ed-47ac-a79d-25b46c8e956e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.569642 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.570259 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05e7e385-beb4-4e06-8718-fd68e90ba74e-proxy-tls\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.589304 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-btbkk\" (UniqueName: \"kubernetes.io/projected/3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b-kube-api-access-btbkk\") pod \"node-resolver-5rjgs\" (UID: \"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\") " pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.589327 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdrn\" (UniqueName: \"kubernetes.io/projected/05e7e385-beb4-4e06-8718-fd68e90ba74e-kube-api-access-2mdrn\") pod \"machine-config-daemon-b5hgc\" (UID: \"05e7e385-beb4-4e06-8718-fd68e90ba74e\") " pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.601864 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.605535 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k98nx\" (UniqueName: \"kubernetes.io/projected/3af5c84a-80ed-47ac-a79d-25b46c8e956e-kube-api-access-k98nx\") pod \"multus-additional-cni-plugins-d42qm\" (UID: \"3af5c84a-80ed-47ac-a79d-25b46c8e956e\") " pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.607580 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqgs\" (UniqueName: \"kubernetes.io/projected/428dd081-b1bb-404f-856a-f33a1fa7c24a-kube-api-access-5qqgs\") pod \"multus-pj7v4\" (UID: \"428dd081-b1bb-404f-856a-f33a1fa7c24a\") " pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.618706 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.628939 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.646798 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.660857 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.686478 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.698686 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.712497 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.722797 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.735982 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.741537 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5rjgs" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.749007 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: W0217 16:42:33.753986 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df13ae5_41cc_4e30_9a22_b3cde3ceeb5b.slice/crio-6dd386059a80d01ed9910a1eb894fcc36a0941cb483c84eab97025f4533f6a75 WatchSource:0}: Error finding container 6dd386059a80d01ed9910a1eb894fcc36a0941cb483c84eab97025f4533f6a75: Status 404 returned error can't find the container with id 6dd386059a80d01ed9910a1eb894fcc36a0941cb483c84eab97025f4533f6a75 Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.760385 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.765831 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8fjpm"] Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.767190 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.769892 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.770238 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.770314 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.770436 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.770444 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.770576 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.770994 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.775729 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.784330 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pj7v4" Feb 17 16:42:33 crc kubenswrapper[4694]: W0217 16:42:33.794818 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428dd081_b1bb_404f_856a_f33a1fa7c24a.slice/crio-1fc63ae1b96d394e317aec4552c70f3ddf9e67899b5146e650c20f6fd7e60db1 WatchSource:0}: Error finding container 1fc63ae1b96d394e317aec4552c70f3ddf9e67899b5146e650c20f6fd7e60db1: Status 404 returned error can't find the container with id 1fc63ae1b96d394e317aec4552c70f3ddf9e67899b5146e650c20f6fd7e60db1 Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.795868 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.806434 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.806749 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.815495 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.818854 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d42qm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.828199 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b
491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: W0217 16:42:33.836745 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af5c84a_80ed_47ac_a79d_25b46c8e956e.slice/crio-54aad600c3e10048f1e753bf5fea79a5d0d7be2cc1561858995fb64ceef62dcb WatchSource:0}: Error finding container 54aad600c3e10048f1e753bf5fea79a5d0d7be2cc1561858995fb64ceef62dcb: 
Status 404 returned error can't find the container with id 54aad600c3e10048f1e753bf5fea79a5d0d7be2cc1561858995fb64ceef62dcb Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.842031 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.855383 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.859799 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:30:08.321901649 +0000 UTC Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.863591 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-systemd\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc 
kubenswrapper[4694]: I0217 16:42:33.863716 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.863811 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-bin\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.863916 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-systemd-units\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864108 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-config\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864236 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fjpm\" (UID: 
\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864349 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864400 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpm9k\" (UniqueName: \"kubernetes.io/projected/d15f1d18-d80a-4fc0-a710-a95c74465b6e-kube-api-access-hpm9k\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864466 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-netns\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864487 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-node-log\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864518 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-kubelet\") pod 
\"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864539 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-ovn\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864561 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-script-lib\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864583 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-etc-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864627 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovn-node-metrics-cert\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864664 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-log-socket\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864688 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-netd\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864711 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-env-overrides\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864736 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-slash\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.864760 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-var-lib-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.868871 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.880464 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.899429 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.907626 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.930858 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965598 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-netns\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965750 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-node-log\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965796 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-kubelet\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965813 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-ovn\") pod \"ovnkube-node-8fjpm\" (UID: 
\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965827 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-script-lib\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965872 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-etc-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965888 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovn-node-metrics-cert\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965908 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-log-socket\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965924 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-netd\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965966 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-env-overrides\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965981 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-slash\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.965998 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-var-lib-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966036 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-systemd\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966054 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 
crc kubenswrapper[4694]: I0217 16:42:33.966070 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-bin\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966091 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-systemd-units\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966122 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-config\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966140 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966161 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 
16:42:33.966190 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpm9k\" (UniqueName: \"kubernetes.io/projected/d15f1d18-d80a-4fc0-a710-a95c74465b6e-kube-api-access-hpm9k\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966688 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-netns\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966744 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-node-log\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966767 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-kubelet\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.966803 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-ovn\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968196 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-etc-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968753 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-systemd\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968799 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968846 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-bin\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968866 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-log-socket\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968885 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-slash\") pod \"ovnkube-node-8fjpm\" (UID: 
\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968908 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-var-lib-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968919 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-netd\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968763 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-systemd-units\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.968952 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.969108 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-script-lib\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.969182 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-openvswitch\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.969575 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-env-overrides\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.969713 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-config\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.974920 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovn-node-metrics-cert\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.985257 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.986082 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpm9k\" (UniqueName: \"kubernetes.io/projected/d15f1d18-d80a-4fc0-a710-a95c74465b6e-kube-api-access-hpm9k\") pod \"ovnkube-node-8fjpm\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:33 crc kubenswrapper[4694]: I0217 16:42:33.999319 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.036549 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.038713 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.040023 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22" exitCode=255 Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.040102 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.040199 4694 scope.go:117] "RemoveContainer" containerID="e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.041877 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.041910 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"1df6166f3563ed7de2b3c51b835d1bbfb9ff80fa9ca5e41516a9d97ca203dc7a"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.045915 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerStarted","Data":"3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.045977 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerStarted","Data":"1fc63ae1b96d394e317aec4552c70f3ddf9e67899b5146e650c20f6fd7e60db1"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.047185 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5rjgs" event={"ID":"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b","Type":"ContainerStarted","Data":"e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.047212 4694 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5rjgs" event={"ID":"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b","Type":"ContainerStarted","Data":"6dd386059a80d01ed9910a1eb894fcc36a0941cb483c84eab97025f4533f6a75"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.048598 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ddf14f0cd3d998e79e7685bb7be3c95148a8dcf157e0d826b5f0050973673a9c"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.049775 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerStarted","Data":"54aad600c3e10048f1e753bf5fea79a5d0d7be2cc1561858995fb64ceef62dcb"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.051192 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.051225 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.051238 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5532df28ee3bdcf4e64306392f9f994419845e2b46a792d6047f81a6486b3508"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.052994 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.053035 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"61a6e44e17c06e637d62096a160ed7f92ea16cf763f91c4b7f2599e0b276aa99"} Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.055256 4694 scope.go:117] "RemoveContainer" containerID="63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22" Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.055458 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.056094 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.063003 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.071275 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.079700 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.080226 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.089704 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.098702 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.107592 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.118915 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.152691 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: W0217 16:42:34.157872 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15f1d18_d80a_4fc0_a710_a95c74465b6e.slice/crio-a96ce4fc59b89f1167fcdd19815e797107f2a28859cd9561e5bb9d889bfcc8d3 WatchSource:0}: Error finding container a96ce4fc59b89f1167fcdd19815e797107f2a28859cd9561e5bb9d889bfcc8d3: Status 404 returned error can't find the container with id a96ce4fc59b89f1167fcdd19815e797107f2a28859cd9561e5bb9d889bfcc8d3 Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.189529 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.228543 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.274426 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.316654 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.349502 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.392837 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.429297 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.471298 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.471376 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.471469 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.471521 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:36.471505003 +0000 UTC m=+24.228580337 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.471712 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.471824 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:36.47180113 +0000 UTC m=+24.228876504 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.472978 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.516260 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.551914 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.572492 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:34 crc 
kubenswrapper[4694]: E0217 16:42:34.572620 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:42:36.572583382 +0000 UTC m=+24.329658706 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.572687 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.572742 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572852 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572854 4694 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572871 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572875 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572883 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572884 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572929 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:36.57291995 +0000 UTC m=+24.329995274 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.572958 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:36.572950301 +0000 UTC m=+24.330025625 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.598620 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.635667 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:26Z\\\",\\\"message\\\":\\\"W0217 16:42:16.061710 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 16:42:16.062256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771346536 cert, and key in /tmp/serving-cert-4231249995/serving-signer.crt, /tmp/serving-cert-4231249995/serving-signer.key\\\\nI0217 16:42:16.558707 1 observer_polling.go:159] Starting file observer\\\\nW0217 16:42:16.560980 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 16:42:16.561187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 16:42:16.568370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4231249995/tls.crt::/tmp/serving-cert-4231249995/tls.key\\\\\\\"\\\\nF0217 16:42:26.776577 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.670709 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.719810 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.752707 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.809452 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.836399 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.860080 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 17:39:17.010100011 +0000 UTC Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.872206 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.894706 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.894826 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.894727 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.894887 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.894715 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:34 crc kubenswrapper[4694]: E0217 16:42:34.894933 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.900952 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.901450 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.902354 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.902983 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.903556 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.904883 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.905456 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.906528 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.907172 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.908053 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.908558 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.909637 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.910175 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.910681 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.910840 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:34Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.911537 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.912066 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.913055 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.913491 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.914035 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.915704 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.916335 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.916999 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.917585 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.918432 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.919819 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.920455 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.921696 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.922223 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.923270 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.923818 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.924323 4694 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.924831 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.926532 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.927071 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.928136 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.929858 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.930801 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.931807 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.932490 4694 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.933602 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.934118 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.935207 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.935974 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.937179 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.937747 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.940331 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.941150 4694 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.942523 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.943210 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.944243 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.944881 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.945565 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.946744 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 16:42:34 crc kubenswrapper[4694]: I0217 16:42:34.947240 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.058048 4694 
generic.go:334] "Generic (PLEG): container finished" podID="3af5c84a-80ed-47ac-a79d-25b46c8e956e" containerID="dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb" exitCode=0 Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.058120 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerDied","Data":"dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb"} Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.060577 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e"} Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.061899 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.063882 4694 scope.go:117] "RemoveContainer" containerID="63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22" Feb 17 16:42:35 crc kubenswrapper[4694]: E0217 16:42:35.064073 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.064805 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" exitCode=0 Feb 
17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.064898 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.064971 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"a96ce4fc59b89f1167fcdd19815e797107f2a28859cd9561e5bb9d889bfcc8d3"} Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.075714 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.103886 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.104075 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.116568 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a02c2797d53cdb704c8f35583859d717746b70ce5a57a1c5430cd5306bfafa\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:26Z\\\",\\\"message\\\":\\\"W0217 16:42:16.061710 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 16:42:16.062256 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771346536 cert, and key in /tmp/serving-cert-4231249995/serving-signer.crt, /tmp/serving-cert-4231249995/serving-signer.key\\\\nI0217 16:42:16.558707 1 observer_polling.go:159] Starting file observer\\\\nW0217 16:42:16.560980 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 16:42:16.561187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 16:42:16.568370 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4231249995/tls.crt::/tmp/serving-cert-4231249995/tls.key\\\\\\\"\\\\nF0217 16:42:26.776577 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.133805 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.145652 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.158730 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.195783 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.238169 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.273385 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.313880 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.361059 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.391112 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.434476 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.480171 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.510778 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.550567 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.599200 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.638783 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.693362 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.737813 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.752076 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.792548 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.833678 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.860451 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:58:48.567045673 +0000 UTC Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.873746 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.911921 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.952284 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:35 crc kubenswrapper[4694]: I0217 16:42:35.995085 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:35Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:35.999963 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-r6gvx"] Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.000377 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.024950 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.043431 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.064725 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.070371 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.070420 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.070435 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.070449 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.070461 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.071955 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.073743 4694 generic.go:334] "Generic (PLEG): container finished" podID="3af5c84a-80ed-47ac-a79d-25b46c8e956e" containerID="24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3" exitCode=0 Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.073875 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerDied","Data":"24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3"} Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.075128 4694 scope.go:117] "RemoveContainer" containerID="63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.075538 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.083313 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 16:42:36 crc 
kubenswrapper[4694]: I0217 16:42:36.090093 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8178c0dd-5081-42ba-ae9d-3384017e0cb8-host\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.090141 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8178c0dd-5081-42ba-ae9d-3384017e0cb8-serviceca\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.090251 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfc4f\" (UniqueName: \"kubernetes.io/projected/8178c0dd-5081-42ba-ae9d-3384017e0cb8-kube-api-access-dfc4f\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.111152 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.152000 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.191026 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfc4f\" (UniqueName: \"kubernetes.io/projected/8178c0dd-5081-42ba-ae9d-3384017e0cb8-kube-api-access-dfc4f\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: 
I0217 16:42:36.191093 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8178c0dd-5081-42ba-ae9d-3384017e0cb8-host\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.191113 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8178c0dd-5081-42ba-ae9d-3384017e0cb8-serviceca\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.193591 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8178c0dd-5081-42ba-ae9d-3384017e0cb8-host\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.193963 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8178c0dd-5081-42ba-ae9d-3384017e0cb8-serviceca\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.202235 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.219977 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfc4f\" (UniqueName: \"kubernetes.io/projected/8178c0dd-5081-42ba-ae9d-3384017e0cb8-kube-api-access-dfc4f\") pod \"node-ca-r6gvx\" (UID: \"8178c0dd-5081-42ba-ae9d-3384017e0cb8\") " pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.251798 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830
51ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.290253 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.332831 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.372171 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.411267 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.423758 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-r6gvx" Feb 17 16:42:36 crc kubenswrapper[4694]: W0217 16:42:36.441532 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8178c0dd_5081_42ba_ae9d_3384017e0cb8.slice/crio-91813fad7252809cf3c206033cb8ccbc62a4257e1c4d35794e8642381eeaef56 WatchSource:0}: Error finding container 91813fad7252809cf3c206033cb8ccbc62a4257e1c4d35794e8642381eeaef56: Status 404 returned error can't find the container with id 91813fad7252809cf3c206033cb8ccbc62a4257e1c4d35794e8642381eeaef56 Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.453278 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.494355 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.494449 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.494530 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.494575 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.494659 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:40.494599136 +0000 UTC m=+28.251674480 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.494686 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:40.494673788 +0000 UTC m=+28.251749122 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.498209 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.535829 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.572883 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.595259 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.595378 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595421 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:42:40.595395179 +0000 UTC m=+28.352470503 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595491 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.595492 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595505 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595517 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595558 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:40.595545732 +0000 UTC m=+28.352621046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595647 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595660 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595670 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.595700 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:40.595692606 +0000 UTC m=+28.352767930 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.611185 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.650396 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.695472 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.746246 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:36Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.861127 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:58:50.410276586 +0000 UTC Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.894831 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.894833 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.894983 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.895045 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:36 crc kubenswrapper[4694]: I0217 16:42:36.894859 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:36 crc kubenswrapper[4694]: E0217 16:42:36.895115 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.081116 4694 generic.go:334] "Generic (PLEG): container finished" podID="3af5c84a-80ed-47ac-a79d-25b46c8e956e" containerID="ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315" exitCode=0 Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.081223 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerDied","Data":"ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315"} Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.084018 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r6gvx" event={"ID":"8178c0dd-5081-42ba-ae9d-3384017e0cb8","Type":"ContainerStarted","Data":"60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52"} Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.084059 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-r6gvx" event={"ID":"8178c0dd-5081-42ba-ae9d-3384017e0cb8","Type":"ContainerStarted","Data":"91813fad7252809cf3c206033cb8ccbc62a4257e1c4d35794e8642381eeaef56"} Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.091942 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.100110 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.114312 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.137326 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb
442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.163745 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.180706 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.205008 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.220845 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.232128 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.244442 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.255787 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.270531 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.285187 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.296938 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.308505 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.335025 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.371742 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.412197 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.454397 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.492870 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.532301 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.572758 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.612628 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.653285 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.702868 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.731548 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.772388 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.812016 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.849833 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.862027 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:22:44.555782255 +0000 UTC Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.911476 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri
-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:37 crc kubenswrapper[4694]: I0217 16:42:37.934562 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb
1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:37Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.098745 4694 generic.go:334] "Generic (PLEG): container finished" podID="3af5c84a-80ed-47ac-a79d-25b46c8e956e" containerID="8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f" exitCode=0 Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.098811 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerDied","Data":"8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f"} Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 
16:42:38.118106 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.142053 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.159244 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.170548 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.182337 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.194781 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.210332 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.255532 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.298222 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.334906 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.370107 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.410707 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.450458 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.493679 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb
442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.525644 4694 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.527318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.527371 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.527388 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.527548 4694 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.539917 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.564816 4694 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.565129 4694 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.566319 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.566387 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.566408 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.566434 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.566452 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.580350 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.584883 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.584948 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.584972 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.584999 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.585020 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.596929 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.599935 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.599970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.599981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.600001 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.600021 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.631488 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.631714 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.631901 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.632093 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.632457 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.645872 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:38Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.646011 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.647309 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.647333 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.647341 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.647354 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.647364 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.750791 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.751152 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.751353 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.751544 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.751747 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.856285 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.856338 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.856348 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.856367 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.856380 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.862271 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 02:36:36.298072192 +0000 UTC Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.894745 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.894906 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.895160 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.894917 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.895248 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:38 crc kubenswrapper[4694]: E0217 16:42:38.895405 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.959721 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.960086 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.960105 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.960129 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:38 crc kubenswrapper[4694]: I0217 16:42:38.960148 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:38Z","lastTransitionTime":"2026-02-17T16:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.062319 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.062357 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.062368 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.062384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.062395 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.105682 4694 generic.go:334] "Generic (PLEG): container finished" podID="3af5c84a-80ed-47ac-a79d-25b46c8e956e" containerID="247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05" exitCode=0 Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.105787 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerDied","Data":"247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.111382 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.120841 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.133703 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.156826 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.164319 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.164354 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.164366 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.164407 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.164430 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.169678 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.189815 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.212318 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.225416 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.245188 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.258428 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.266880 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.266959 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.266975 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.266996 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.267015 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.279324 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.300919 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.314443 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.341132 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.360488 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.370198 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.370233 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.370242 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.370257 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.370267 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.376777 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:39Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.472677 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.472736 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.472754 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.472775 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.472791 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.574564 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.574600 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.574638 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.574653 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.574664 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.677017 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.677064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.677075 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.677094 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.677109 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.779287 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.779327 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.779338 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.779355 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.779366 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.862887 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:49:31.337303538 +0000 UTC Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.882988 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.883041 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.883054 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.883071 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.883082 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.985463 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.985538 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.985559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.985585 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:39 crc kubenswrapper[4694]: I0217 16:42:39.985604 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:39Z","lastTransitionTime":"2026-02-17T16:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.089175 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.089245 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.089264 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.089290 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.089307 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.117936 4694 generic.go:334] "Generic (PLEG): container finished" podID="3af5c84a-80ed-47ac-a79d-25b46c8e956e" containerID="fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e" exitCode=0 Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.117986 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerDied","Data":"fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.143725 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.162401 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.182049 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.191566 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.191597 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.191622 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.191635 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.191644 4694 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.204759 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.221498 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.241625 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.254470 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.265455 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.279067 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.290362 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.293770 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.293802 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.293811 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.293827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.293836 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.303257 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.316991 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.328534 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.338204 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.350814 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.396541 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.396573 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.396583 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.396597 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.396650 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.498624 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.498656 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.498666 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.498680 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.498691 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.533990 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.534228 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.534334 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:48.534309362 +0000 UTC m=+36.291384746 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.534480 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.534686 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.534764 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:48.534747133 +0000 UTC m=+36.291822537 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.600578 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.600659 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.600673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.600691 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.600707 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.635315 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.635428 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635484 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:42:48.635463473 +0000 UTC m=+36.392538797 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.635532 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635587 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635627 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635648 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635666 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: 
E0217 16:42:40.635680 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635690 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635707 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:48.635691809 +0000 UTC m=+36.392767153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.635729 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:42:48.635718269 +0000 UTC m=+36.392793613 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.703105 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.703144 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.703155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.703173 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.703184 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.805864 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.805902 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.805914 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.805930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.805943 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.863969 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 06:45:18.558694483 +0000 UTC Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.895392 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.895456 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.895456 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.895562 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.895735 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:40 crc kubenswrapper[4694]: E0217 16:42:40.895798 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.910995 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.911025 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.911033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.911045 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:40 crc kubenswrapper[4694]: I0217 16:42:40.911054 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:40Z","lastTransitionTime":"2026-02-17T16:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.012643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.012709 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.012722 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.012737 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.012750 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.114952 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.114991 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.115002 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.115017 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.115027 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.124761 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" event={"ID":"3af5c84a-80ed-47ac-a79d-25b46c8e956e","Type":"ContainerStarted","Data":"d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.130324 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.130653 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.137499 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.148943 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.156033 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.164294 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.176224 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.189641 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.201951 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.214118 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.218193 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc 
kubenswrapper[4694]: I0217 16:42:41.218237 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.218251 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.218269 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.218313 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.226710 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.235313 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.254367 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.271916 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.281513 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.299649 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.310863 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.319936 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.319972 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.319982 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.319995 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.320004 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.321226 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.337104 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers 
with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\
\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.347268 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.358204 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.373285 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc03
90a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.386855 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.405197 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.422210 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.422469 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.422500 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.422510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.422525 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.422536 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.435710 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.446984 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.459786 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.469410 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.488575 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.504445 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.519417 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.524707 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.524739 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.524748 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.524760 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.524769 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.534712 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:41Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.627250 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.627317 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.627335 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.627361 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.627379 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.730002 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.730089 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.730114 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.730145 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.730167 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.833300 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.833352 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.833364 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.833382 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.833396 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.864374 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 08:55:11.073343775 +0000 UTC Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.936039 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.936110 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.936134 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.936162 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:41 crc kubenswrapper[4694]: I0217 16:42:41.936186 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:41Z","lastTransitionTime":"2026-02-17T16:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.039088 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.039127 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.039141 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.039156 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.039167 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.121364 4694 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.134374 4694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.135043 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.141360 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.141431 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.141454 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.141485 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.141510 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.167031 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.182310 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.206563 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.225381 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.234782 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.244334 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.244384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.244396 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.244411 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 
16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.244421 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.245825 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.258145 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.271142 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.284081 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.298359 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.314532 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.326340 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.343552 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.347331 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.347367 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.347376 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.347394 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.347404 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.356250 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.370509 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.391702 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.450344 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.450388 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.450396 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.450411 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.450420 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.552858 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.552906 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.552918 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.552937 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.552950 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.655229 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.655270 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.655283 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.655301 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.655314 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.757430 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.757499 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.757523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.757553 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.757574 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.860453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.860507 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.860517 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.860534 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.860545 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.864631 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:56:52.188260517 +0000 UTC Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.895054 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.895103 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:42 crc kubenswrapper[4694]: E0217 16:42:42.895186 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.895054 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:42 crc kubenswrapper[4694]: E0217 16:42:42.895282 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:42 crc kubenswrapper[4694]: E0217 16:42:42.895499 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.905988 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.914923 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dn
s-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.928969 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.951763 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.962595 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.962663 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.962674 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.962691 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.962704 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:42Z","lastTransitionTime":"2026-02-17T16:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.966194 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:42 crc kubenswrapper[4694]: I0217 16:42:42.988352 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.001814 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.011399 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.021375 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.036238 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.051669 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.062928 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.065482 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.065646 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.065737 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc 
kubenswrapper[4694]: I0217 16:42:43.065845 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.065923 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.077000 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 
16:42:43.087893 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 
16:42:43.100188 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.136452 4694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.169528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 
16:42:43.169564 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.169574 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.169589 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.169601 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.271210 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.271240 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.271252 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.271267 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.271277 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.373854 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.373891 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.373902 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.373918 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.373928 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.476061 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.476094 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.476102 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.476116 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.476125 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.557584 4694 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.578346 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.578380 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.578391 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.578405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.578416 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.680585 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.680651 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.680666 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.680685 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.680700 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.804150 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.804220 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.804236 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.804263 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.804280 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.864808 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:44:27.421133169 +0000 UTC Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.907368 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.907409 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.907419 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.907432 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:43 crc kubenswrapper[4694]: I0217 16:42:43.907443 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:43Z","lastTransitionTime":"2026-02-17T16:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.010576 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.010957 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.010970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.010986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.010997 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.113944 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.113972 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.113980 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.114025 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.114037 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.142336 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/0.log" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.148114 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e" exitCode=1 Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.148195 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.149205 4694 scope.go:117] "RemoveContainer" containerID="8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.162757 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.175827 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc03
90a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.195102 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16
:42:43Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 16:42:43.930057 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 16:42:43.930071 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 16:42:43.930095 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 16:42:43.930104 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 16:42:43.930123 6051 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 16:42:43.930146 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 16:42:43.930148 6051 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 16:42:43.930175 6051 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 16:42:43.930196 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 16:42:43.930206 6051 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 16:42:43.930206 6051 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 16:42:43.930185 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 16:42:43.930261 6051 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 16:42:43.930281 6051 factory.go:656] Stopping watch factory\\\\nI0217 16:42:43.930314 6051 ovnkube.go:599] Stopped 
ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd15
60f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.207640 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.216062 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.216091 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.216099 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.216114 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.216124 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.218081 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.231347 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://830
51ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.242958 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.262421 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16
:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.279797 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.290148 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.299412 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.310431 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.319030 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc 
kubenswrapper[4694]: I0217 16:42:44.319061 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.319071 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.319086 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.319097 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.321470 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.332433 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.344289 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:44Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.422059 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.422124 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.422141 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.422163 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.422181 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.525464 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.525547 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.525572 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.525884 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.525938 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.628948 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.628988 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.629023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.629038 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.629056 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.731099 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.731153 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.731163 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.731176 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.731187 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.833473 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.833503 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.833512 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.833524 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.833533 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.865236 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 00:35:46.586614264 +0000 UTC Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.894714 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.894792 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:44 crc kubenswrapper[4694]: E0217 16:42:44.894811 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.894714 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:44 crc kubenswrapper[4694]: E0217 16:42:44.894915 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:44 crc kubenswrapper[4694]: E0217 16:42:44.895001 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.935754 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.935811 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.935838 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.935863 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:44 crc kubenswrapper[4694]: I0217 16:42:44.935878 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:44Z","lastTransitionTime":"2026-02-17T16:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.038159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.038229 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.038248 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.038265 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.038293 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.140307 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.140360 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.140385 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.140403 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.140417 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.152218 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/0.log" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.154984 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.155097 4694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.172543 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.182385 4694 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.186556 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.198532 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.212807 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.227500 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.243112 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.243181 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.243204 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.243232 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.243256 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.246321 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.262567 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.278871 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.293503 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.313270 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.337991 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:43Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 16:42:43.930057 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 16:42:43.930071 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 16:42:43.930095 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 16:42:43.930104 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 16:42:43.930123 6051 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 16:42:43.930146 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 16:42:43.930148 6051 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 16:42:43.930175 6051 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 16:42:43.930196 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 16:42:43.930206 6051 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 16:42:43.930206 6051 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 16:42:43.930185 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 16:42:43.930261 6051 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 16:42:43.930281 6051 factory.go:656] Stopping watch factory\\\\nI0217 16:42:43.930314 6051 ovnkube.go:599] Stopped 
ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.345340 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.345385 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.345403 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.345423 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.345440 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.351039 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.368390 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.383042 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.395963 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.447504 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.447554 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.447570 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.447591 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 
16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.447628 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.551178 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.551256 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.551297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.551327 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.551354 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.654358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.654427 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.654447 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.654471 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.654489 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.673842 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr"] Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.674379 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.677186 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.679830 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.693140 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.707127 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8
b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.727229 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.755875 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:43Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 16:42:43.930057 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 16:42:43.930071 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 16:42:43.930095 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 16:42:43.930104 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 16:42:43.930123 6051 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 16:42:43.930146 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 16:42:43.930148 6051 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 16:42:43.930175 6051 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 16:42:43.930196 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 16:42:43.930206 6051 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 16:42:43.930206 6051 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 16:42:43.930185 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 16:42:43.930261 6051 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 16:42:43.930281 6051 factory.go:656] Stopping watch factory\\\\nI0217 16:42:43.930314 6051 ovnkube.go:599] Stopped 
ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.757004 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.757040 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.757051 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.757070 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.757083 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.768424 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.787040 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d99a34c-188b-4df8-9d10-48ca322b8d9f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.787125 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d99a34c-188b-4df8-9d10-48ca322b8d9f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 
crc kubenswrapper[4694]: I0217 16:42:45.787163 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr7ht\" (UniqueName: \"kubernetes.io/projected/4d99a34c-188b-4df8-9d10-48ca322b8d9f-kube-api-access-zr7ht\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.787230 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d99a34c-188b-4df8-9d10-48ca322b8d9f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.792699 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.807564 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.817168 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.832204 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.847093 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.858800 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.858990 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.859015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.859023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc 
kubenswrapper[4694]: I0217 16:42:45.859037 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.859046 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.866194 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 14:32:12.762451092 +0000 UTC Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.872264 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.888271 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d99a34c-188b-4df8-9d10-48ca322b8d9f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.888319 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d99a34c-188b-4df8-9d10-48ca322b8d9f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 
16:42:45.888346 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d99a34c-188b-4df8-9d10-48ca322b8d9f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.888364 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr7ht\" (UniqueName: \"kubernetes.io/projected/4d99a34c-188b-4df8-9d10-48ca322b8d9f-kube-api-access-zr7ht\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.889255 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d99a34c-188b-4df8-9d10-48ca322b8d9f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.889918 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d99a34c-188b-4df8-9d10-48ca322b8d9f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.890329 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.895836 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4d99a34c-188b-4df8-9d10-48ca322b8d9f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.905892 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.911537 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr7ht\" (UniqueName: \"kubernetes.io/projected/4d99a34c-188b-4df8-9d10-48ca322b8d9f-kube-api-access-zr7ht\") pod \"ovnkube-control-plane-749d76644c-g8vpr\" (UID: \"4d99a34c-188b-4df8-9d10-48ca322b8d9f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.923746 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.942215 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:45Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.961258 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.961292 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.961305 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.961321 4694 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.961333 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:45Z","lastTransitionTime":"2026-02-17T16:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:45 crc kubenswrapper[4694]: I0217 16:42:45.987919 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" Feb 17 16:42:46 crc kubenswrapper[4694]: W0217 16:42:46.009359 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d99a34c_188b_4df8_9d10_48ca322b8d9f.slice/crio-386c36461cb0db1c3690c790896054fc0c58c3c2e9708c2edc81f44b1ed8b04f WatchSource:0}: Error finding container 386c36461cb0db1c3690c790896054fc0c58c3c2e9708c2edc81f44b1ed8b04f: Status 404 returned error can't find the container with id 386c36461cb0db1c3690c790896054fc0c58c3c2e9708c2edc81f44b1ed8b04f Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.063213 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.063266 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.063279 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.063296 4694 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.063312 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.160079 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" event={"ID":"4d99a34c-188b-4df8-9d10-48ca322b8d9f","Type":"ContainerStarted","Data":"386c36461cb0db1c3690c790896054fc0c58c3c2e9708c2edc81f44b1ed8b04f"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.162148 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/1.log" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.162869 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/0.log" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.165593 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.165673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.165695 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.165717 4694 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.165733 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.167472 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824" exitCode=1 Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.167511 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.167549 4694 scope.go:117] "RemoveContainer" containerID="8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.168383 4694 scope.go:117] "RemoveContainer" containerID="a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824" Feb 17 16:42:46 crc kubenswrapper[4694]: E0217 16:42:46.168632 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.181878 4694 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.205040 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":
\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.223550 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb
1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.238496 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.257321 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.268730 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.268761 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.268773 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.268811 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.268824 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.272658 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:
42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.284756 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.297528 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.312741 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.332936 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.348288 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.364664 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.373008 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.373039 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.373047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.373059 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.373067 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.389967 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:43Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 16:42:43.930057 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 16:42:43.930071 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 16:42:43.930095 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 16:42:43.930104 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 16:42:43.930123 6051 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 16:42:43.930146 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 16:42:43.930148 6051 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 16:42:43.930175 6051 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 16:42:43.930196 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 16:42:43.930206 6051 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 16:42:43.930206 6051 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 16:42:43.930185 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 16:42:43.930261 6051 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 16:42:43.930281 6051 factory.go:656] Stopping watch factory\\\\nI0217 16:42:43.930314 6051 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] 
Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.401795 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.412866 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.427035 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:46Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.475183 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.475238 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.475255 4694 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.475277 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.475295 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.577443 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.577511 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.577523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.577540 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.577551 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.680975 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.681042 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.681059 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.681083 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.681102 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.784854 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.784898 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.784913 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.784932 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.784948 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.867128 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:04:24.814218738 +0000 UTC Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.887523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.887571 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.887583 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.887601 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.887642 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.895193 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:46 crc kubenswrapper[4694]: E0217 16:42:46.895323 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.895348 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.895678 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:46 crc kubenswrapper[4694]: E0217 16:42:46.895839 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:46 crc kubenswrapper[4694]: E0217 16:42:46.895981 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.989496 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.989542 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.989553 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.989567 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:46 crc kubenswrapper[4694]: I0217 16:42:46.989577 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:46Z","lastTransitionTime":"2026-02-17T16:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.092950 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.092997 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.093010 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.093030 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.093040 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.144862 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4qb4m"] Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.145550 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: E0217 16:42:47.145676 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.164672 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.173388 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" event={"ID":"4d99a34c-188b-4df8-9d10-48ca322b8d9f","Type":"ContainerStarted","Data":"b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.173451 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" event={"ID":"4d99a34c-188b-4df8-9d10-48ca322b8d9f","Type":"ContainerStarted","Data":"66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.175776 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/1.log" Feb 17 16:42:47 
crc kubenswrapper[4694]: I0217 16:42:47.180528 4694 scope.go:117] "RemoveContainer" containerID="a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824" Feb 17 16:42:47 crc kubenswrapper[4694]: E0217 16:42:47.180855 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.190450 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.196087 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.196133 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.196152 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 
16:42:47.196182 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.196199 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.201889 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.201954 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5nv\" (UniqueName: \"kubernetes.io/projected/974057b2-a009-4d99-8bad-e50b651c8c3c-kube-api-access-nm5nv\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.212865 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb
1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.226919 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.243874 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.260483 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.273557 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.291273 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.298180 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.298222 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.298239 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc 
kubenswrapper[4694]: I0217 16:42:47.298260 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.298276 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.302790 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.302837 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5nv\" (UniqueName: \"kubernetes.io/projected/974057b2-a009-4d99-8bad-e50b651c8c3c-kube-api-access-nm5nv\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: E0217 16:42:47.302947 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:47 crc kubenswrapper[4694]: E0217 16:42:47.302994 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. 
No retries permitted until 2026-02-17 16:42:47.802978474 +0000 UTC m=+35.560053808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.309961 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.321318 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5nv\" (UniqueName: \"kubernetes.io/projected/974057b2-a009-4d99-8bad-e50b651c8c3c-kube-api-access-nm5nv\") pod \"network-metrics-daemon-4qb4m\" 
(UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.326170 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.336735 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.348173 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.367098 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.378312 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.395123 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.400406 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.400445 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.400461 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.400479 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.400492 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.416107 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d4bd323494a6c40f8a9a0c9584e4907f6b2e3d1ad18b52595edf2167d6d180e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:43Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 16:42:43.930057 6051 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 16:42:43.930071 6051 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 16:42:43.930095 6051 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0217 16:42:43.930104 6051 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 16:42:43.930123 6051 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 16:42:43.930146 6051 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 16:42:43.930148 6051 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 16:42:43.930175 6051 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 16:42:43.930196 6051 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 16:42:43.930206 6051 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 16:42:43.930206 6051 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 16:42:43.930185 6051 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 16:42:43.930261 6051 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 16:42:43.930281 6051 factory.go:656] Stopping watch factory\\\\nI0217 16:42:43.930314 6051 ovnkube.go:599] Stopped ovnkube\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] 
Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.427693 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc 
kubenswrapper[4694]: I0217 16:42:47.456491 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.468897 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.479376 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.493259 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.502780 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.502808 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.502818 4694 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.502830 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.502838 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.506128 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.521026 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.536633 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.552150 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.564567 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.577066 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.588742 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.601078 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.605064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.605111 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.605128 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.605151 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.605169 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.613368 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.635718 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.666259 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, 
internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.683235 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.702769 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:47Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.707349 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.707378 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.707388 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.707405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.707418 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.807356 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:47 crc kubenswrapper[4694]: E0217 16:42:47.807560 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:47 crc kubenswrapper[4694]: E0217 16:42:47.808055 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:42:48.808022282 +0000 UTC m=+36.565097646 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.809523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.809689 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.809800 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.809895 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.809980 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.868052 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 04:48:02.543600938 +0000 UTC Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.913060 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.913113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.913133 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.913160 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:47 crc kubenswrapper[4694]: I0217 16:42:47.913183 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:47Z","lastTransitionTime":"2026-02-17T16:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.016352 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.016420 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.016443 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.016472 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.016495 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.120522 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.120658 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.120679 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.120710 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.120734 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.223468 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.223512 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.223523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.223540 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.223552 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.325949 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.325986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.325998 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.326014 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.326026 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.569993 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.570097 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.570223 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.570294 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:04.570275278 +0000 UTC m=+52.327350602 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.571010 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.571040 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:04.571030987 +0000 UTC m=+52.328106311 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.576200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.576248 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.576261 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.576280 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.576298 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.671468 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.671627 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:43:04.671585729 +0000 UTC m=+52.428661053 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.671932 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.672064 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672109 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672322 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672185 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672420 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672433 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672483 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:04.672471751 +0000 UTC m=+52.429547075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672399 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.672737 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:04.672723797 +0000 UTC m=+52.429799121 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.679544 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.679578 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.679591 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.679638 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.679654 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.782499 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.782540 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.782549 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.782564 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.782575 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.868357 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 11:47:31.153324399 +0000 UTC Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.874081 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.874253 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.874342 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:42:50.874319561 +0000 UTC m=+38.631394895 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.885823 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.885901 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.885926 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.885956 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.885980 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.895421 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.895511 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.895522 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.895693 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.895701 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.895840 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.895984 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:48 crc kubenswrapper[4694]: E0217 16:42:48.899379 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.988568 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.988635 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.988646 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.988661 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:48 crc kubenswrapper[4694]: I0217 16:42:48.988672 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:48Z","lastTransitionTime":"2026-02-17T16:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.007578 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.007640 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.007652 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.007671 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.007683 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: E0217 16:42:49.022048 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:49Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.026081 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.026111 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.026124 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.026141 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.026152 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: E0217 16:42:49.041901 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:49Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.046239 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.046284 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.046298 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.046319 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.046334 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: E0217 16:42:49.059465 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:49Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.063348 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.063416 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.063428 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.063441 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.063452 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: E0217 16:42:49.077237 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:49Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.081131 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.081176 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.081194 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.081215 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.081229 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: E0217 16:42:49.092274 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:49Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:49 crc kubenswrapper[4694]: E0217 16:42:49.092440 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.094112 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.094149 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.094162 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.094177 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.094189 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.196579 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.196668 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.196686 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.196709 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.196728 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.298875 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.299228 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.299472 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.299786 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.299983 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.403036 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.403066 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.403073 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.403086 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.403094 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.505769 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.505825 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.505843 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.505865 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.505884 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.608414 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.608464 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.608473 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.608490 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.608513 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.710865 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.710915 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.710932 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.710954 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.710971 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.813904 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.814426 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.814643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.814903 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.815103 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.869412 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:44:22.84159059 +0000 UTC Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.895993 4694 scope.go:117] "RemoveContainer" containerID="63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.917578 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.917665 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.917684 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.917708 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:49 crc kubenswrapper[4694]: I0217 16:42:49.917726 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:49Z","lastTransitionTime":"2026-02-17T16:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.020585 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.020859 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.020871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.020884 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.020893 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.122774 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.122832 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.122850 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.122875 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.122895 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.189743 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.191173 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.191763 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.208135 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.225165 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.225631 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.225702 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.225724 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.225746 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.225763 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.249439 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.264525 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.284258 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.300193 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.316205 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.327456 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.328309 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.328364 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.328384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.328405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.328420 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.337593 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.350262 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.362981 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.373326 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.385838 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.404272 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, 
internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.414559 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.425072 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.430405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.430443 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.430454 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.430468 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.430479 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.437138 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:50Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.534085 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.534435 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.534599 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.534849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.535022 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.638015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.638066 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.638082 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.638104 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.638119 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.741291 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.741336 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.741351 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.741370 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.741384 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.843707 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.843743 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.843751 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.843768 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.843778 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.875307 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:00:21.526339894 +0000 UTC Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.892010 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:50 crc kubenswrapper[4694]: E0217 16:42:50.892188 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:50 crc kubenswrapper[4694]: E0217 16:42:50.892298 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:42:54.892272155 +0000 UTC m=+42.649347519 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.894927 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.895005 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:50 crc kubenswrapper[4694]: E0217 16:42:50.895110 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.895150 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.895193 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:50 crc kubenswrapper[4694]: E0217 16:42:50.895267 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:50 crc kubenswrapper[4694]: E0217 16:42:50.895354 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:50 crc kubenswrapper[4694]: E0217 16:42:50.895556 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.945854 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.945892 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.945900 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.945913 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:50 crc kubenswrapper[4694]: I0217 16:42:50.945923 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:50Z","lastTransitionTime":"2026-02-17T16:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.051164 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.051409 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.051515 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.051622 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.051709 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.154921 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.154970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.154986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.155007 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.155022 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.257748 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.257797 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.257812 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.257835 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.257849 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.359700 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.359740 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.359754 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.359767 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.359775 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.462389 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.462488 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.462506 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.462530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.462547 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.565022 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.565246 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.565313 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.565392 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.565449 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.669073 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.669140 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.669164 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.669193 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.669265 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.771021 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.771051 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.771060 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.771073 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.771083 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.874090 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.874155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.874174 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.874201 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.874219 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.876100 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 19:21:35.984528806 +0000 UTC Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.975871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.976082 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.976281 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.976445 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:51 crc kubenswrapper[4694]: I0217 16:42:51.976554 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:51Z","lastTransitionTime":"2026-02-17T16:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.078456 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.078528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.078551 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.078580 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.078600 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.181581 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.181689 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.181715 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.181751 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.181775 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.284746 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.284806 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.284820 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.284843 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.284860 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.387511 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.387794 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.387909 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.388028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.388140 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.490956 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.491033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.491046 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.491064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.491079 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.593575 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.593656 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.593674 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.593698 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.593713 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.697103 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.697159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.697177 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.697200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.697219 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.800827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.800870 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.800887 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.800908 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.800924 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.876996 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 09:07:57.658147829 +0000 UTC Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.894499 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.894592 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.894761 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:52 crc kubenswrapper[4694]: E0217 16:42:52.894780 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.894975 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:52 crc kubenswrapper[4694]: E0217 16:42:52.895035 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:52 crc kubenswrapper[4694]: E0217 16:42:52.895162 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:52 crc kubenswrapper[4694]: E0217 16:42:52.895294 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.903816 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.903863 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.903878 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.903899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.903916 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:52Z","lastTransitionTime":"2026-02-17T16:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.919798 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:52Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.937467 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:52Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.955809 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:52Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.975958 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:52Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:52 crc kubenswrapper[4694]: I0217 16:42:52.993604 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:52Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.007018 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.007087 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.007108 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.007139 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.007163 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.013586 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.029985 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.048289 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.059468 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.076732 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1
dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.103728 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, 
internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.109589 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.109770 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.109874 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.109990 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.110104 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.120818 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc 
kubenswrapper[4694]: I0217 16:42:53.137780 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.157344 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.174182 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.187639 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.210727 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:53Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.212420 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.212472 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.212492 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.212517 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.212536 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.315084 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.315346 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.315510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.315598 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.315710 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.418654 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.418990 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.419018 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.419041 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.419057 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.521760 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.522391 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.522552 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.522863 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.523069 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.627467 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.627915 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.628138 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.628358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.628536 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.731202 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.731243 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.731260 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.731279 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.731293 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.834021 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.834374 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.834514 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.834703 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.834866 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.878454 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:59:09.687488061 +0000 UTC Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.939857 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.939915 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.939933 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.939956 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:53 crc kubenswrapper[4694]: I0217 16:42:53.940160 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:53Z","lastTransitionTime":"2026-02-17T16:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.042467 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.042550 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.042574 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.042601 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.042650 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.144752 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.144812 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.144829 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.144853 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.144870 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.247294 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.247337 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.247352 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.247425 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.247441 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.351030 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.351100 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.351122 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.351147 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.351164 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.454110 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.454176 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.454194 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.454216 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.454233 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.556925 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.557009 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.557035 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.557099 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.557123 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.660121 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.660161 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.660173 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.660190 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.660201 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.762459 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.762546 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.762569 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.762599 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.762667 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.865818 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.865891 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.865917 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.865949 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.865972 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.879285 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 23:00:55.448852204 +0000 UTC Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.894726 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.894766 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:54 crc kubenswrapper[4694]: E0217 16:42:54.894923 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.894968 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.894949 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:54 crc kubenswrapper[4694]: E0217 16:42:54.895077 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:54 crc kubenswrapper[4694]: E0217 16:42:54.895207 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:54 crc kubenswrapper[4694]: E0217 16:42:54.895492 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.930925 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:54 crc kubenswrapper[4694]: E0217 16:42:54.931353 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:54 crc kubenswrapper[4694]: E0217 16:42:54.931555 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:43:02.931531269 +0000 UTC m=+50.688606623 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.968743 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.968811 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.968827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.968849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:54 crc kubenswrapper[4694]: I0217 16:42:54.968864 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:54Z","lastTransitionTime":"2026-02-17T16:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.071779 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.072381 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.072567 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.073010 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.073203 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.175786 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.176062 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.176184 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.176293 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.176389 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.278631 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.278657 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.278665 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.278678 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.278686 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.372806 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.373903 4694 scope.go:117] "RemoveContainer" containerID="a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.381160 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.381203 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.381218 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.381238 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.381254 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.484851 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.484889 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.484898 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.484913 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.484924 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.589107 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.589226 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.589246 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.589296 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.589308 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.692420 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.692489 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.692513 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.692560 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.692585 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.816270 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.816312 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.816327 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.816347 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.816363 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.879984 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 21:15:00.106923171 +0000 UTC Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.919111 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.919148 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.919162 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.919177 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:55 crc kubenswrapper[4694]: I0217 16:42:55.919188 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:55Z","lastTransitionTime":"2026-02-17T16:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.021862 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.021919 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.021932 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.021951 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.021964 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.124220 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.124279 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.124297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.124335 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.124355 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.217698 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/1.log" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.220394 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.221223 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.226444 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.226518 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.226530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.226545 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.226556 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.245408 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.270728 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.289497 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set 
denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.302852 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.316904 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.329812 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.329874 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.330023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.330039 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.330065 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.330080 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.340286 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.353764 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.367599 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.382423 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.394974 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.411554 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.424683 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc 
kubenswrapper[4694]: I0217 16:42:56.432335 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.432425 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.432438 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.432456 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.432467 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.438282 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.449535 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.463866 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.482250 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, 
internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:56Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.535037 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.535078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.535087 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.535101 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.535111 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.637924 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.637953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.637962 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.637974 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.637983 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.740867 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.740911 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.740921 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.740938 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.740950 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.843909 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.843941 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.843953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.843967 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.843977 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.880730 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:29:13.721128159 +0000 UTC Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.895124 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:56 crc kubenswrapper[4694]: E0217 16:42:56.895345 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.895389 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.895430 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:56 crc kubenswrapper[4694]: E0217 16:42:56.895463 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:56 crc kubenswrapper[4694]: E0217 16:42:56.895500 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.895650 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:56 crc kubenswrapper[4694]: E0217 16:42:56.895713 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.946828 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.946877 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.946889 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.946906 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:56 crc kubenswrapper[4694]: I0217 16:42:56.946917 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:56Z","lastTransitionTime":"2026-02-17T16:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.049846 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.049890 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.049907 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.049923 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.049934 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.152714 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.152794 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.152817 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.152847 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.152869 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.225849 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/2.log" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.226572 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/1.log" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.230143 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de" exitCode=1 Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.230289 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.230396 4694 scope.go:117] "RemoveContainer" containerID="a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.231249 4694 scope.go:117] "RemoveContainer" containerID="11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de" Feb 17 16:42:57 crc kubenswrapper[4694]: E0217 16:42:57.231730 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.247668 4694 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.255229 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.255318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.255338 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.255401 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.255422 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.263374 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:
42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.275565 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.288250 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.302809 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.322196 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.335877 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.348898 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.357882 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.357921 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.357934 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc 
kubenswrapper[4694]: I0217 16:42:57.357950 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.357962 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.369743 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a48cd36de1f85422dcc629686f9d5073f56fe3977d8192d405f8c02e1e59c824\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"message\\\":\\\"uster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.213\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, 
internalTrafficLocal:false, hasNodePort:false}}\\\\nI0217 16:42:45.078167 6188 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-r6gvx\\\\nI0217 16:42:45.078176 6188 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-r6gvx in node crc\\\\nF0217 16:42:45.078177 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet val\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4ef
fdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.381221 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.393130 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.404516 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.426249 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.444021 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.461415 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.461745 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.462012 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.462188 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.462346 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.471511 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.490337 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.501741 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:57Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.564885 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.564990 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.565015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 
16:42:57.565043 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.565063 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.667465 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.667794 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.667882 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.667970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.668076 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.770900 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.770936 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.770949 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.770964 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.770974 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.873777 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.873827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.873844 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.873871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.873885 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.881182 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 19:02:19.550812581 +0000 UTC Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.977287 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.977333 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.977345 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.977361 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:57 crc kubenswrapper[4694]: I0217 16:42:57.977372 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:57Z","lastTransitionTime":"2026-02-17T16:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.080219 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.080256 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.080267 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.080284 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.080296 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.182918 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.182963 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.182973 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.182988 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.182998 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.238197 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/2.log" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.242500 4694 scope.go:117] "RemoveContainer" containerID="11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de" Feb 17 16:42:58 crc kubenswrapper[4694]: E0217 16:42:58.242698 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.257550 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.271290 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.285986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.286029 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.286041 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.286058 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.286071 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.287718 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.304694 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.317684 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.337100 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.357132 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.371850 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.382488 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.387803 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.387855 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.387868 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.387888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.387900 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.396493 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.414504 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.427120 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.439475 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.454453 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.472820 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.485464 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.490150 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.490192 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.490204 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.490232 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.490244 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.504447 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:58Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.593369 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.593685 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.593794 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.593926 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.594062 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.696547 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.696679 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.696703 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.696736 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.696759 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.799568 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.799720 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.799791 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.799818 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.799836 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.881585 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 08:44:01.504000906 +0000 UTC Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.895716 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.895716 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.895734 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.895863 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:42:58 crc kubenswrapper[4694]: E0217 16:42:58.895945 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:42:58 crc kubenswrapper[4694]: E0217 16:42:58.896273 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:42:58 crc kubenswrapper[4694]: E0217 16:42:58.896323 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:42:58 crc kubenswrapper[4694]: E0217 16:42:58.896377 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.904149 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.904205 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.904225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.904252 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:58 crc kubenswrapper[4694]: I0217 16:42:58.904274 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:58Z","lastTransitionTime":"2026-02-17T16:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.006356 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.006386 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.006393 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.006406 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.006416 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.108973 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.109012 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.109022 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.109035 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.109046 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.211427 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.211486 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.211502 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.211525 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.211543 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.314454 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.314488 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.314501 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.314517 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.314529 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.378047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.378093 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.378113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.378137 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.378154 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: E0217 16:42:59.399346 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:59Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.404532 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.404734 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.404820 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.404909 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.405013 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: E0217 16:42:59.425203 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:59Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.429433 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.429494 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.429512 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.429537 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.429555 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: E0217 16:42:59.446036 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:59Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.451970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.452023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.452039 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.452063 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.452080 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: E0217 16:42:59.467925 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:59Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.473671 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.473779 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.473799 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.473822 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.473837 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: E0217 16:42:59.491349 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:42:59Z is after 2025-08-24T17:21:41Z" Feb 17 16:42:59 crc kubenswrapper[4694]: E0217 16:42:59.492179 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.498871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.498917 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.498934 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.498957 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.498975 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.601553 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.601603 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.601649 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.601678 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.601695 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.704002 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.704063 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.704081 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.704104 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.704123 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.806510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.806584 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.806638 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.806671 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.806693 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.882999 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 09:49:25.626042308 +0000 UTC Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.910249 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.910300 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.910311 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.910328 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:42:59 crc kubenswrapper[4694]: I0217 16:42:59.910342 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:42:59Z","lastTransitionTime":"2026-02-17T16:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.012730 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.012797 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.012816 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.012841 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.012858 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.115785 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.115817 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.115843 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.115859 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.115867 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.219146 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.219202 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.219219 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.219241 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.219258 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.322140 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.322215 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.322239 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.322267 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.322289 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.425338 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.425398 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.425419 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.425439 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.425454 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.528262 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.528322 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.528339 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.528361 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.528379 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.631882 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.631923 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.631935 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.631953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.631964 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.734084 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.734151 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.734169 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.734196 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.734217 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.837367 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.837430 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.837442 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.837458 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.837472 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.883350 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 15:53:25.944791534 +0000 UTC Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.895066 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.895157 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.895189 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:00 crc kubenswrapper[4694]: E0217 16:43:00.895415 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.895471 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:00 crc kubenswrapper[4694]: E0217 16:43:00.895690 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:00 crc kubenswrapper[4694]: E0217 16:43:00.895798 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:00 crc kubenswrapper[4694]: E0217 16:43:00.895909 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.940055 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.940365 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.940511 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.940707 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:00 crc kubenswrapper[4694]: I0217 16:43:00.940943 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:00Z","lastTransitionTime":"2026-02-17T16:43:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.044088 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.044132 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.044146 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.044165 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.044179 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.146983 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.147028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.147043 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.147064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.147077 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.250655 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.250717 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.250740 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.250770 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.250797 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.352941 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.353008 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.353025 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.353049 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.353065 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.455321 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.455395 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.455426 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.455458 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.455481 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.557282 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.557340 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.557351 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.557367 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.557378 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.660245 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.660304 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.660320 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.660342 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.660359 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.763596 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.763651 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.763664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.763680 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.763692 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.866373 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.866412 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.866424 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.866442 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.866454 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.883988 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:25:59.780742735 +0000 UTC Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.973707 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.973744 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.973755 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.973770 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:01 crc kubenswrapper[4694]: I0217 16:43:01.973780 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:01Z","lastTransitionTime":"2026-02-17T16:43:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.077167 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.077256 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.077447 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.077474 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.077489 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.180494 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.180683 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.180711 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.180744 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.180770 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.283459 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.283732 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.283883 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.284013 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.284114 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.387104 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.387178 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.387196 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.387222 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.387240 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.490048 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.490443 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.490556 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.490643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.490704 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.593490 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.593938 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.594030 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.594097 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.594155 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.698015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.698064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.698073 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.698089 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.698099 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.801369 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.801415 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.801426 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.801449 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.801470 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.884733 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 01:51:13.54791659 +0000 UTC Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.895483 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.895733 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.895734 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:02 crc kubenswrapper[4694]: E0217 16:43:02.895937 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.895968 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:02 crc kubenswrapper[4694]: E0217 16:43:02.896068 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:02 crc kubenswrapper[4694]: E0217 16:43:02.896127 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:02 crc kubenswrapper[4694]: E0217 16:43:02.896236 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.903390 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.903436 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.903453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.903477 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.903492 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:02Z","lastTransitionTime":"2026-02-17T16:43:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.914223 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:02Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.944149 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://45625
08d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:02Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.960277 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:02Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.976884 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:02Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:02 crc kubenswrapper[4694]: I0217 16:43:02.994662 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:02Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.005883 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.005932 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.005944 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.005965 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.005978 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.013816 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:
42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.024975 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:03 crc kubenswrapper[4694]: E0217 16:43:03.025118 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:43:03 crc kubenswrapper[4694]: E0217 16:43:03.025567 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:43:19.025547291 +0000 UTC m=+66.782622615 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.030136 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc 
kubenswrapper[4694]: I0217 16:43:03.050602 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.068898 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.083444 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.096429 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.108110 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.108140 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.108152 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.108171 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.108184 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.113647 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.131531 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.144500 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.156742 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.174706 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.187808 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:03Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.210323 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc 
kubenswrapper[4694]: I0217 16:43:03.210368 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.210383 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.210404 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.210419 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.313822 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.313899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.313925 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.313958 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.313982 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.416848 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.417076 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.417220 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.417324 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.417458 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.520281 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.520361 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.520378 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.520402 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.520417 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.623412 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.623481 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.623496 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.623519 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.623536 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.726109 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.726145 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.726155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.726169 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.726179 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.828986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.829060 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.829081 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.829110 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.829130 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.884885 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 11:45:43.319488826 +0000 UTC Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.931445 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.931819 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.932028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.932241 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:03 crc kubenswrapper[4694]: I0217 16:43:03.932926 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:03Z","lastTransitionTime":"2026-02-17T16:43:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.036332 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.036393 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.036410 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.036433 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.036452 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.138837 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.138886 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.138897 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.138915 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.138929 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.148645 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.169684 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45625
08d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.185326 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.195118 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.210434 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.223813 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.235408 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.241774 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.241848 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.241866 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.241888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.241903 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.250121 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.264204 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.275465 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.289403 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.302974 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.316508 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.329052 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.341155 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.344894 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.344945 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.344957 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.344975 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.344995 4694 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.372040 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.380930 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.391371 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:04Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.448670 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.448733 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.448751 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.448777 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.448797 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.552511 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.552604 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.552673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.552701 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.552733 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.642766 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.642871 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.642922 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.643050 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:36.643018814 +0000 UTC m=+84.400094178 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.643147 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.643281 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:36.64325283 +0000 UTC m=+84.400328194 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.655562 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.655632 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.655645 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.655663 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.655675 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.743977 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744244 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:43:36.744203366 +0000 UTC m=+84.501278720 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.744495 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.744563 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744742 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744780 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744777 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744800 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744818 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744835 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744867 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:36.744851943 +0000 UTC m=+84.501927297 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.744910 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:43:36.744880614 +0000 UTC m=+84.501956048 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.758296 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.758340 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.758352 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.758368 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.758380 4694 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.861159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.861291 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.861430 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.861453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.861472 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.885736 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 23:26:41.71277584 +0000 UTC Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.895267 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.895361 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.895440 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.895526 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.895697 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.895746 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.895829 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:04 crc kubenswrapper[4694]: E0217 16:43:04.895949 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.965174 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.965237 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.965261 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.965290 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:04 crc kubenswrapper[4694]: I0217 16:43:04.965312 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:04Z","lastTransitionTime":"2026-02-17T16:43:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.069098 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.069152 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.069166 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.069189 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.069202 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.172444 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.172514 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.172537 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.172563 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.172582 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.266367 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.276207 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.276885 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.277206 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.277399 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.277595 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.282692 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.305867 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362
a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.328874 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.348128 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.360944 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.372004 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc 
kubenswrapper[4694]: I0217 16:43:05.380087 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.380129 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.380144 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.380164 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.380182 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.386460 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddca
a0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.406718 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.423180 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.444115 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.460874 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.479107 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.483280 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc 
kubenswrapper[4694]: I0217 16:43:05.483318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.483329 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.483347 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.483358 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.492135 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.502762 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.518228 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.538700 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.550954 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.561693 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:05Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.586282 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.586426 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.586455 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.586481 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.586498 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.689729 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.689762 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.689773 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.689788 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.689798 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.792538 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.792595 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.792630 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.792653 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.792670 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.886452 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 02:42:52.340220525 +0000 UTC Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.895249 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.895410 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.895501 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.895563 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.895645 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.998216 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.998294 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.998314 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.998352 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:05 crc kubenswrapper[4694]: I0217 16:43:05.998370 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:05Z","lastTransitionTime":"2026-02-17T16:43:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.101917 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.101964 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.101981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.102005 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.102021 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.205144 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.205225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.205249 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.205296 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.205319 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.308320 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.308364 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.308420 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.308437 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.308449 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.410819 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.410857 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.410868 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.410883 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.410894 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.513227 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.513278 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.513295 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.513319 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.513336 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.616206 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.616269 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.616291 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.616317 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.616335 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.719122 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.719381 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.719462 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.719570 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.719669 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.822775 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.822840 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.822855 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.822875 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.822891 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.887504 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 05:22:33.533080917 +0000 UTC Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.894987 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.894989 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.895008 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.895122 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:06 crc kubenswrapper[4694]: E0217 16:43:06.895219 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:06 crc kubenswrapper[4694]: E0217 16:43:06.895356 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:06 crc kubenswrapper[4694]: E0217 16:43:06.895533 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:06 crc kubenswrapper[4694]: E0217 16:43:06.895750 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.925620 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.925659 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.925668 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.925681 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:06 crc kubenswrapper[4694]: I0217 16:43:06.925691 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:06Z","lastTransitionTime":"2026-02-17T16:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.028490 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.028525 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.028534 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.028548 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.028558 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.131775 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.131814 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.131825 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.131840 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.131851 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.234463 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.234510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.234519 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.234554 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.234566 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.336674 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.336733 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.336750 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.336774 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.336793 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.440199 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.440326 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.440343 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.440366 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.440384 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.543304 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.543363 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.543384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.543400 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.543409 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.645881 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.645935 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.645952 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.645975 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.645993 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.748306 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.748365 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.748382 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.748404 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.748421 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.850673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.850738 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.850756 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.850779 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.850796 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.888299 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 07:15:06.218551577 +0000 UTC Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.953962 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.954055 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.954077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.954105 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:07 crc kubenswrapper[4694]: I0217 16:43:07.954127 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:07Z","lastTransitionTime":"2026-02-17T16:43:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.057584 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.057689 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.057715 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.057745 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.057809 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.160974 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.161055 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.161078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.161108 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.161131 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.264293 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.264354 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.264366 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.264382 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.264414 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.367498 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.367559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.367583 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.367643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.367668 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.471918 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.471993 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.472003 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.472019 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.472031 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.575968 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.576047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.576072 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.576102 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.576124 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.678659 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.678731 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.678747 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.678771 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.678794 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.782003 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.782051 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.782062 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.782078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.782090 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.884714 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.884776 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.884795 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.884818 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.884837 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.889267 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 23:21:13.948965727 +0000 UTC Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.894827 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.894881 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.894953 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:08 crc kubenswrapper[4694]: E0217 16:43:08.895129 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.895203 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:08 crc kubenswrapper[4694]: E0217 16:43:08.895413 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:08 crc kubenswrapper[4694]: E0217 16:43:08.895529 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:08 crc kubenswrapper[4694]: E0217 16:43:08.895659 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.987896 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.987984 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.988691 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.988773 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:08 crc kubenswrapper[4694]: I0217 16:43:08.989096 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:08Z","lastTransitionTime":"2026-02-17T16:43:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.092214 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.092270 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.092287 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.092310 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.092327 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.194035 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.194072 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.194084 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.194102 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.194114 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.296926 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.297004 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.297023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.297047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.297066 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.400875 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.400957 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.400979 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.401011 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.401033 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.504038 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.504113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.504132 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.504159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.504178 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.606979 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.607038 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.607052 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.607075 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.607091 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.710750 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.710812 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.710835 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.710858 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.710876 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.714245 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.714419 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.714537 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.714697 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.714830 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: E0217 16:43:09.731151 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:09Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.740522 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.740561 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.740572 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.740587 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.740598 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: E0217 16:43:09.755259 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:09Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.759021 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.759077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.759093 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.759116 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.759132 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: E0217 16:43:09.772413 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:09Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.776804 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.776853 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.776869 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.776888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.776903 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: E0217 16:43:09.790357 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:09Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.794246 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.794293 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.794303 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.794318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.794327 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: E0217 16:43:09.807159 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:09Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:09 crc kubenswrapper[4694]: E0217 16:43:09.807359 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.814054 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.814103 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.814113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.814127 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.814137 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.890250 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:59:15.639873224 +0000 UTC Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.917241 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.917315 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.917349 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.917376 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:09 crc kubenswrapper[4694]: I0217 16:43:09.917397 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:09Z","lastTransitionTime":"2026-02-17T16:43:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.019489 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.019545 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.019556 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.019569 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.019577 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.121953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.122000 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.122015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.122039 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.122055 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.224530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.224598 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.224654 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.224701 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.224726 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.326548 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.326582 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.326592 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.326618 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.326628 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.429475 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.429813 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.429915 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.430039 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.430154 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.533394 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.533811 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.534083 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.534297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.534722 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.638543 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.639001 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.639047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.639077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.639099 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.742475 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.742521 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.742538 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.742559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.742597 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.845952 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.846002 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.846013 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.846053 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.846065 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.891169 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:43:17.381851033 +0000 UTC Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.894599 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:10 crc kubenswrapper[4694]: E0217 16:43:10.894803 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.894907 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.894957 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.894909 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:10 crc kubenswrapper[4694]: E0217 16:43:10.895069 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:10 crc kubenswrapper[4694]: E0217 16:43:10.895174 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:10 crc kubenswrapper[4694]: E0217 16:43:10.895284 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.896500 4694 scope.go:117] "RemoveContainer" containerID="11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de" Feb 17 16:43:10 crc kubenswrapper[4694]: E0217 16:43:10.896809 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.949680 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.949761 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.949779 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.949845 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:10 crc kubenswrapper[4694]: I0217 16:43:10.949862 4694 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:10Z","lastTransitionTime":"2026-02-17T16:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.053107 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.053175 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.053200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.053229 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.053251 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.155985 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.156040 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.156053 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.156071 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.156084 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.259403 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.259453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.259465 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.259485 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.259499 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.362470 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.363160 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.363308 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.363465 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.363819 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.466903 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.467447 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.467575 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.467720 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.467816 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.570402 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.570471 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.570488 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.570510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.570529 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.672929 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.673028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.673480 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.673559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.673888 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.776443 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.776526 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.776544 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.776570 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.776588 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.879939 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.880003 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.880027 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.880056 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.880078 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.892025 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 03:37:34.702677921 +0000 UTC Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.982581 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.982645 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.982658 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.982676 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:11 crc kubenswrapper[4694]: I0217 16:43:11.982688 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:11Z","lastTransitionTime":"2026-02-17T16:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.085348 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.085404 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.085421 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.085445 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.085461 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.189076 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.189123 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.189139 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.189163 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.189181 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.291203 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.291256 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.291277 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.291303 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.291322 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.393905 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.393970 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.393989 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.394015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.394034 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.496232 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.496273 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.496282 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.496297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.496307 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.598834 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.598899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.598916 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.598938 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.598957 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.702343 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.702390 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.702399 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.702415 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.702426 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.805248 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.805655 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.805893 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.806156 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.806371 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.892281 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:34:02.300163412 +0000 UTC Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.895293 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:12 crc kubenswrapper[4694]: E0217 16:43:12.895461 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.895794 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:12 crc kubenswrapper[4694]: E0217 16:43:12.896486 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.896578 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.896970 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:12 crc kubenswrapper[4694]: E0217 16:43:12.897056 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:12 crc kubenswrapper[4694]: E0217 16:43:12.899404 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.908379 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.908455 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.908467 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.908589 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.908648 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:12Z","lastTransitionTime":"2026-02-17T16:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.929082 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:12Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.949147 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:12Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.961356 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:12Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.977426 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:12Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:12 crc kubenswrapper[4694]: I0217 16:43:12.996591 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:12Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.011229 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.011478 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.011672 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.011834 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.011952 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.017682 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.036294 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.051095 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.063867 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.073851 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd819
9d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.084825 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.095974 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.110469 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.114271 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.114318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.114336 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.114358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.114400 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.121971 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.137359 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.158461 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.169961 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.182071 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:13Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.216027 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.216062 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.216077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.216124 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.216139 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.318920 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.318965 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.318979 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.318995 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.319008 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.422395 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.422484 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.422504 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.422525 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.422543 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.525358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.525747 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.525864 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.525939 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.526008 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.627942 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.627984 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.627997 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.628014 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.628025 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.730644 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.730684 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.730695 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.730713 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.730727 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.833370 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.833405 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.833413 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.833425 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.833435 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.893398 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:05:55.523049588 +0000 UTC Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.935453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.935493 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.935504 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.935517 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:13 crc kubenswrapper[4694]: I0217 16:43:13.935526 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:13Z","lastTransitionTime":"2026-02-17T16:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.038228 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.038285 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.038307 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.038334 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.038356 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.140544 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.140856 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.140981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.141152 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.141297 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.244910 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.245374 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.245397 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.245422 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.245440 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.348699 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.348742 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.348756 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.348774 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.348786 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.451785 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.451821 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.451831 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.451849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.451868 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.557282 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.557728 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.557988 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.558526 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.558827 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.661930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.662009 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.662033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.662065 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.662086 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.764740 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.764803 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.764821 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.764845 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.764871 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.868087 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.868402 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.868531 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.868864 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.869038 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.894490 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 23:51:00.656133519 +0000 UTC Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.894580 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.894625 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.894687 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.894776 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:14 crc kubenswrapper[4694]: E0217 16:43:14.895906 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:14 crc kubenswrapper[4694]: E0217 16:43:14.895924 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:14 crc kubenswrapper[4694]: E0217 16:43:14.896121 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:14 crc kubenswrapper[4694]: E0217 16:43:14.896238 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.971664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.971717 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.971727 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.971740 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:14 crc kubenswrapper[4694]: I0217 16:43:14.971749 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:14Z","lastTransitionTime":"2026-02-17T16:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.074449 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.074518 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.074541 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.074569 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.074591 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.177871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.177939 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.177956 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.177980 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.177999 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.280213 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.280256 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.280298 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.280325 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.280341 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.382476 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.382520 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.382530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.382544 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.382554 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.485079 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.485113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.485140 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.485158 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.485170 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.591373 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.591469 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.591498 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.591531 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.591560 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.694930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.695010 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.695362 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.695432 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.695451 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.798711 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.798777 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.798800 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.798829 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.798850 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.895736 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:50:41.627442382 +0000 UTC Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.902630 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.902660 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.902673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.902689 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:15 crc kubenswrapper[4694]: I0217 16:43:15.902707 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:15Z","lastTransitionTime":"2026-02-17T16:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.004957 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.004994 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.005005 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.005023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.005037 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.107850 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.107964 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.107981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.108384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.108440 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.210553 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.210602 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.210638 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.210655 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.210668 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.312891 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.313300 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.313434 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.313566 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.313740 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.415899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.416251 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.416375 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.416519 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.416682 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.519782 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.519836 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.519856 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.519879 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.519897 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.622878 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.622909 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.622918 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.622930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.622939 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.726090 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.726488 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.726671 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.726831 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.726959 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.829704 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.830033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.830299 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.830664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.830938 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.894766 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:16 crc kubenswrapper[4694]: E0217 16:43:16.894936 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.895117 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.895244 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:16 crc kubenswrapper[4694]: E0217 16:43:16.895469 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.895499 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:16 crc kubenswrapper[4694]: E0217 16:43:16.895829 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.895973 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:12:43.470138528 +0000 UTC Feb 17 16:43:16 crc kubenswrapper[4694]: E0217 16:43:16.895590 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.934431 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.934781 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.934969 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.935067 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:16 crc kubenswrapper[4694]: I0217 16:43:16.935141 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:16Z","lastTransitionTime":"2026-02-17T16:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.038132 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.038191 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.038209 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.038234 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.038250 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.140100 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.140184 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.140209 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.140243 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.140270 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.242556 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.242663 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.242706 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.242739 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.242762 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.345600 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.345654 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.345666 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.345692 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.345704 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.449090 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.449239 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.449264 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.449295 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.449316 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.551794 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.551866 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.551889 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.551938 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.551975 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.653911 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.653944 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.653953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.653966 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.653977 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.756181 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.756222 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.756232 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.756249 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.756262 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.859019 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.859058 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.859069 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.859085 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.859096 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.896885 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 19:44:34.013255918 +0000 UTC Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.962078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.962116 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.962125 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.962140 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:17 crc kubenswrapper[4694]: I0217 16:43:17.962150 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:17Z","lastTransitionTime":"2026-02-17T16:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.064800 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.064827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.064837 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.064848 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.064857 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.167291 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.167340 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.167745 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.167812 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.167836 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.271028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.271078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.271095 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.271116 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.271129 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.373447 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.373523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.373541 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.373564 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.373581 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.475788 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.475826 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.475835 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.475849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.475861 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.578530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.578752 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.578793 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.578829 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.578853 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.682078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.682134 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.682146 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.682164 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.682176 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.784845 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.784888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.784902 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.784919 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.784931 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.888364 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.888429 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.888444 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.888465 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.888482 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.895001 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.895018 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.895069 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.895118 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:18 crc kubenswrapper[4694]: E0217 16:43:18.895227 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:18 crc kubenswrapper[4694]: E0217 16:43:18.895333 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:18 crc kubenswrapper[4694]: E0217 16:43:18.895426 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:18 crc kubenswrapper[4694]: E0217 16:43:18.895565 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.897186 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 14:29:18.552895236 +0000 UTC Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.991179 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.991224 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.991238 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.991255 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:18 crc kubenswrapper[4694]: I0217 16:43:18.991266 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:18Z","lastTransitionTime":"2026-02-17T16:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.094107 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.094161 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.094186 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.094253 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.094272 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.107141 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:19 crc kubenswrapper[4694]: E0217 16:43:19.107296 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:43:19 crc kubenswrapper[4694]: E0217 16:43:19.107375 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:43:51.10735313 +0000 UTC m=+98.864428484 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.195998 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.196063 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.196084 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.196108 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.196126 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.298751 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.298810 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.298823 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.298839 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.298850 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.400973 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.400999 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.401006 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.401020 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.401029 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.503161 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.503197 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.503210 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.503225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.503238 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.605985 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.606018 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.606027 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.606041 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.606051 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.708428 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.708471 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.708515 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.708542 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.708554 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.810659 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.810699 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.810708 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.810725 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.810736 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.897319 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 08:47:18.972503907 +0000 UTC Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.914017 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.914056 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.914066 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.914081 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:19 crc kubenswrapper[4694]: I0217 16:43:19.914092 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:19Z","lastTransitionTime":"2026-02-17T16:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.016919 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.016967 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.016983 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.017001 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.017013 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.077869 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.077902 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.077913 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.077927 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.077936 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.091925 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:20Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.095395 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.095419 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.095428 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.095442 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.095453 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.107355 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:20Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.110560 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.110586 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.110595 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.110619 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.110628 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.121851 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:20Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.126176 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.126276 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.126296 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.126320 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.126337 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.138873 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:20Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.144547 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.144599 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.144643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.144667 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.144686 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.161921 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:20Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.162164 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.163840 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.163922 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.164033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.164048 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.164057 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.266355 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.266401 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.266418 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.266440 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.266457 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.368538 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.368584 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.368596 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.368632 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.368645 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.470476 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.470525 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.470534 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.470548 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.470560 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.573057 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.573112 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.573127 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.573147 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.573163 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.675432 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.675492 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.675510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.675527 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.675539 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.777950 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.778000 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.778021 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.778051 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.778074 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.880361 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.880398 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.880408 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.880423 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.880434 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.894891 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.894932 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.894968 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.894940 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.895086 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.895198 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.895421 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:20 crc kubenswrapper[4694]: E0217 16:43:20.895497 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.897679 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 20:58:53.539301603 +0000 UTC Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.908478 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.982762 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.982810 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.982826 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.982845 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:20 crc kubenswrapper[4694]: I0217 16:43:20.982862 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:20Z","lastTransitionTime":"2026-02-17T16:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.085979 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.086033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.086055 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.086099 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.086122 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.188157 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.188201 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.188217 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.188241 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.188256 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.289825 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.289864 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.289873 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.289887 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.289896 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.391628 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.391664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.391673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.391688 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.391698 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.494280 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.494314 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.494324 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.494336 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.494346 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.596505 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.596556 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.596578 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.596600 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.596639 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.698524 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.698563 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.698574 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.698591 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.698728 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.800673 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.800706 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.800716 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.800729 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.800739 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.898391 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:50:01.920296615 +0000 UTC Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.902832 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.902867 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.902877 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.902890 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:21 crc kubenswrapper[4694]: I0217 16:43:21.902900 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:21Z","lastTransitionTime":"2026-02-17T16:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.004767 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.004804 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.004818 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.004859 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.004869 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.107184 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.107235 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.107245 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.107261 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.107273 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.209687 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.209738 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.209753 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.209769 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.209781 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.312136 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.312185 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.312194 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.312208 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.312221 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.320357 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/0.log" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.320400 4694 generic.go:334] "Generic (PLEG): container finished" podID="428dd081-b1bb-404f-856a-f33a1fa7c24a" containerID="3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d" exitCode=1 Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.320427 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerDied","Data":"3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.321162 4694 scope.go:117] "RemoveContainer" containerID="3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.336687 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.347667 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.365302 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.379373 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.391077 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.403098 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.414577 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.414643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.414655 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.414672 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.414684 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.416241 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.426204 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.435631 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.448112 4694 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.461740 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.471753 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.484006 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.503437 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.515411 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.517057 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.517091 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.517125 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.517142 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.517155 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.526919 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae127
3199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.539902 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.549341 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.562178 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.618815 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.618857 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.618867 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.618886 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.618899 4694 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.721178 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.721204 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.721211 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.721225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.721233 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.823272 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.823316 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.823328 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.823345 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.823357 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.895171 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.895207 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.895230 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:22 crc kubenswrapper[4694]: E0217 16:43:22.895322 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.895339 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:22 crc kubenswrapper[4694]: E0217 16:43:22.895484 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:22 crc kubenswrapper[4694]: E0217 16:43:22.895538 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:22 crc kubenswrapper[4694]: E0217 16:43:22.895641 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.898519 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 09:57:41.629404361 +0000 UTC Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.910251 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.920054 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.925647 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.925694 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.925707 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.925724 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.925733 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:22Z","lastTransitionTime":"2026-02-17T16:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.930262 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.942458 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.965721 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.977122 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:22 crc kubenswrapper[4694]: I0217 16:43:22.993381 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:22Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.005301 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.023766 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.027701 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.027748 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.027758 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.027776 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.027789 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.043398 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.058539 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.070454 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.079989 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc 
kubenswrapper[4694]: I0217 16:43:23.093895 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.107047 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.121465 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.130897 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.130930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.130939 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.130952 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.130962 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.135057 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.150967 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.165483 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.232790 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.232867 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.232880 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.232897 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.232910 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.324134 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/0.log" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.324187 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerStarted","Data":"2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.334686 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1296
2a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.335237 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.335274 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.335284 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.335298 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.335314 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.351627 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.363114 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.371898 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.382959 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.392311 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.401769 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.413137 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.426127 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.437414 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.437664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.437695 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.437707 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.437723 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.437733 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.452554 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.466311 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.477326 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc 
kubenswrapper[4694]: I0217 16:43:23.490516 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.502631 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.512771 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.526185 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.540370 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.540404 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.540413 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.540426 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.540436 4694 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.550806 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.561244 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:23Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.643163 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.643203 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.643214 4694 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.643230 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.643244 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.745560 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.745591 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.745599 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.745624 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.745632 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.847622 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.847659 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.847668 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.847682 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.847691 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.899343 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 06:56:21.058816107 +0000 UTC Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.950549 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.950654 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.950681 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.950714 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:23 crc kubenswrapper[4694]: I0217 16:43:23.950737 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:23Z","lastTransitionTime":"2026-02-17T16:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.052908 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.052951 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.052964 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.052981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.052992 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.156148 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.156510 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.156522 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.156535 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.156544 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.258528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.258591 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.258626 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.258647 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.258659 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.361349 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.361398 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.361410 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.361427 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.361439 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.464001 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.464045 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.464056 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.464072 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.464084 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.565938 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.565973 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.565986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.566002 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.566014 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.668135 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.668157 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.668165 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.668176 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.668185 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.770139 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.770173 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.770185 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.770201 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.770214 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.872852 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.872890 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.872902 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.872919 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.872933 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.894673 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:24 crc kubenswrapper[4694]: E0217 16:43:24.894798 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.894839 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.894900 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:24 crc kubenswrapper[4694]: E0217 16:43:24.894948 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:24 crc kubenswrapper[4694]: E0217 16:43:24.895108 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.895130 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:24 crc kubenswrapper[4694]: E0217 16:43:24.895245 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.899484 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 03:18:46.672221575 +0000 UTC Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.975477 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.975889 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.975913 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.975933 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:24 crc kubenswrapper[4694]: I0217 16:43:24.975945 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:24Z","lastTransitionTime":"2026-02-17T16:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.077720 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.077777 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.077800 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.077828 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.077853 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.181025 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.181068 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.181079 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.181097 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.181110 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.283422 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.283471 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.283481 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.283497 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.283508 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.385973 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.386016 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.386032 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.386054 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.386072 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.488509 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.488546 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.488554 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.488583 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.488594 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.590768 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.590813 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.590826 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.590844 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.590857 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.693172 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.693215 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.693225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.693240 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.693251 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.795332 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.795370 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.795378 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.795412 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.795421 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.895705 4694 scope.go:117] "RemoveContainer" containerID="11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.899891 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:58:22.164408224 +0000 UTC Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.903053 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.903114 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.903133 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.903155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:25 crc kubenswrapper[4694]: I0217 16:43:25.903177 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:25Z","lastTransitionTime":"2026-02-17T16:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.026899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.026958 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.026973 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.026990 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.027001 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.128837 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.128871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.128879 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.128893 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.128902 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.231024 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.231064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.231077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.231094 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.231106 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.333439 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.333599 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.333651 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.333682 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.333704 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.335284 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/2.log" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.340939 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.342017 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.366977 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.381836 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.392095 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.416138 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.436046 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.436071 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.436078 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.436090 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.436099 4694 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.442084 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.460226 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc 
kubenswrapper[4694]: I0217 16:43:26.477273 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.490169 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.508439 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.522898 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.532903 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.538091 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.538126 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.538134 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 
16:43:26.538150 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.538162 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.547228 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.560633 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.576666 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.592067 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.609168 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.619798 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.630679 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.640664 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.640705 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.640717 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc 
kubenswrapper[4694]: I0217 16:43:26.640734 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.640745 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.645991 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17
T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:26Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.742990 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.743248 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.743352 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.743453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.743561 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.846630 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.846671 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.846681 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.846698 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.846711 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.894672 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.894701 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:26 crc kubenswrapper[4694]: E0217 16:43:26.894784 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.894891 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:26 crc kubenswrapper[4694]: E0217 16:43:26.894996 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.895152 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:26 crc kubenswrapper[4694]: E0217 16:43:26.895205 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:26 crc kubenswrapper[4694]: E0217 16:43:26.895314 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.899986 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 04:11:04.082260206 +0000 UTC Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.949207 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.949249 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.949260 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.949276 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:26 crc kubenswrapper[4694]: I0217 16:43:26.949287 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:26Z","lastTransitionTime":"2026-02-17T16:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.051749 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.054828 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.054857 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.054878 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.054892 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.157365 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.157417 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.157430 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.157446 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.157457 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.260480 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.260568 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.260593 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.260667 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.260695 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.345104 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/3.log" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.346101 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/2.log" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.348139 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" exitCode=1 Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.348171 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.348200 4694 scope.go:117] "RemoveContainer" containerID="11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.348769 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:43:27 crc kubenswrapper[4694]: E0217 16:43:27.348925 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.363124 4694 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.363152 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.363159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.363171 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.363180 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.366399 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.397746 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.445820 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.465022 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.466047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.466069 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.466077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 
16:43:27.466088 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.466097 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.480570 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.494771 4694 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 
to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-c
erts\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.507982 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.519780 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.534458 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.549165 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.564469 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.568273 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.568316 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.568329 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.568348 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.568362 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.581244 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.599568 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11a3609957de770fbce412e8d2d70c3dee43cfb1a6344f51991d13f0289054de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:42:56Z\\\",\\\"message\\\":\\\" reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255684 6402 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 16:42:56.255875 6402 factory.go:656] Stopping watch factory\\\\nI0217 16:42:56.255919 6402 ovnkube.go:599] Stopped ovnkube\\\\nI0217 16:42:56.255963 6402 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 16:42:56.256059 6402 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:26Z\\\",\\\"message\\\":\\\"scaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 16:43:26.898079 6800 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 16:43:26.898091 6800 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-pj7v4\\\\nI0217 16:43:26.898767 6800 services_controller.go:452] Built service openshift-machine-api/cluster-autoscaler-operator per-node LB for network=default: []services.LB{}\\\\nF0217 16:43:26.898788 6800 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:43:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4eff
dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.609860 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.623037 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c
4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.633351 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.643034 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.656400 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25
961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.670340 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:27Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.670918 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 
16:43:27.670956 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.670967 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.670984 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.670997 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.773814 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.773891 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.773921 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.773948 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.773967 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.876397 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.876452 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.876468 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.876488 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.876503 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.900521 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 04:03:35.794427827 +0000 UTC Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.979213 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.979273 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.979288 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.979309 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:27 crc kubenswrapper[4694]: I0217 16:43:27.979323 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:27Z","lastTransitionTime":"2026-02-17T16:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.082257 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.082330 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.082358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.082388 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.082414 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.185479 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.185530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.185547 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.185571 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.185588 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.288421 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.288466 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.288479 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.288495 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.288507 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.353740 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/3.log" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.361162 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:43:28 crc kubenswrapper[4694]: E0217 16:43:28.361360 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.375428 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.390007 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.393997 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.394288 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.394304 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.394318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.394327 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.406317 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e80
9248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.419457 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.431785 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd5
13991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.442185 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.455056 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.468680 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.486479 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc2
6d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.497195 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.497227 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.497234 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.497246 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.497256 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.505919 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:26Z\\\",\\\"message\\\":\\\"scaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, 
services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 16:43:26.898079 6800 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 16:43:26.898091 6800 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-pj7v4\\\\nI0217 16:43:26.898767 6800 services_controller.go:452] Built service openshift-machine-api/cluster-autoscaler-operator per-node LB for network=default: []services.LB{}\\\\nF0217 16:43:26.898788 6800 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:43:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.517944 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.529702 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9
b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.543372 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.553161 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.567963 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.580741 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.592055 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc 
kubenswrapper[4694]: I0217 16:43:28.599123 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.599166 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.599175 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.599190 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.599202 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.610144 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.623063 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:28Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.702110 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.702157 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.702168 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.702184 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.702194 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.805061 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.805104 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.805117 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.805133 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.805143 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.895129 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.895247 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:28 crc kubenswrapper[4694]: E0217 16:43:28.895320 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.895354 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.895362 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:28 crc kubenswrapper[4694]: E0217 16:43:28.895478 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:28 crc kubenswrapper[4694]: E0217 16:43:28.895595 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:28 crc kubenswrapper[4694]: E0217 16:43:28.895824 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.901682 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 10:57:56.584719038 +0000 UTC Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.906867 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.906927 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.906938 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.906953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:28 crc kubenswrapper[4694]: I0217 16:43:28.906964 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:28Z","lastTransitionTime":"2026-02-17T16:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.008929 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.008964 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.008976 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.008991 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.009019 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.111072 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.111113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.111124 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.111140 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.111152 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.214740 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.214795 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.214817 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.214849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.214871 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.317416 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.317477 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.317497 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.317521 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.317537 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.420818 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.420920 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.420944 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.420978 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.421002 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.523365 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.523422 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.523442 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.523468 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.523485 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.626696 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.626783 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.626799 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.626822 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.626837 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.733476 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.733541 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.733556 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.733585 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.733599 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.836158 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.836221 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.836236 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.836257 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.836273 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.902702 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 11:31:19.366813461 +0000 UTC Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.939632 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.939669 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.939677 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.939690 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:29 crc kubenswrapper[4694]: I0217 16:43:29.939699 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:29Z","lastTransitionTime":"2026-02-17T16:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.042089 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.042137 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.042146 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.042158 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.042166 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.144475 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.144523 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.144537 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.144559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.144574 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.246823 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.246888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.246912 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.246937 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.246957 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.349772 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.349815 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.349823 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.349838 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.349848 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.388473 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.388522 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.388543 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.388564 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.388579 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.406661 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:30Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.410168 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.410194 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.410201 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.410214 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.410226 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.426301 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:30Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.430485 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.430551 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.430573 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.430646 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.430665 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.449268 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:30Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.453676 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.453754 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.453778 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.453807 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.453827 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.466721 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:30Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.469676 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.469718 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.469730 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.469750 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.469763 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.482733 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:30Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.482843 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.484274 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.484315 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.484326 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.484344 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.484357 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.587208 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.587252 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.587269 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.587292 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.587308 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.690381 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.690432 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.690444 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.690461 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.690476 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.793870 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.793925 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.793940 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.793964 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.793978 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.894682 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.894688 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.894769 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.894808 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.894945 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.895032 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.895144 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:30 crc kubenswrapper[4694]: E0217 16:43:30.895486 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.896311 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.896372 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.896390 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.896413 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.896429 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.903424 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:30:51.347068908 +0000 UTC Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.998661 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.998726 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.998739 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.998756 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:30 crc kubenswrapper[4694]: I0217 16:43:30.998768 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:30Z","lastTransitionTime":"2026-02-17T16:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.101378 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.101421 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.101433 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.101448 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.101459 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.205576 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.205636 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.205647 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.205665 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.205677 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.308521 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.308573 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.308584 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.308601 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.308636 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.412550 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.412595 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.412604 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.412632 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.412641 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.515939 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.516024 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.516047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.516079 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.516101 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.618117 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.618164 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.618175 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.618192 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.618206 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.720977 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.721043 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.721066 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.721089 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.721105 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.823944 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.823985 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.823996 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.824015 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.824026 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.904294 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 22:48:11.790430896 +0000 UTC Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.926896 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.926976 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.927012 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.927044 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:31 crc kubenswrapper[4694]: I0217 16:43:31.927067 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:31Z","lastTransitionTime":"2026-02-17T16:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.029781 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.029840 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.029857 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.029881 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.029899 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.133521 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.133582 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.133601 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.133654 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.133674 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.235915 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.236029 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.236055 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.236094 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.236180 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.339017 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.339069 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.339088 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.339115 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.339132 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.442853 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.442912 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.442930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.442953 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.442972 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.545661 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.545716 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.545733 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.545755 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.545774 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.649796 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.649832 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.649842 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.649858 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.649869 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.752967 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.753056 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.753073 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.753096 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.753114 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.855504 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.855594 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.855640 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.855668 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.855725 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.894787 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.894861 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:32 crc kubenswrapper[4694]: E0217 16:43:32.895164 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.895201 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:32 crc kubenswrapper[4694]: E0217 16:43:32.895334 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.895384 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:32 crc kubenswrapper[4694]: E0217 16:43:32.895570 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:32 crc kubenswrapper[4694]: E0217 16:43:32.895708 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.905238 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:18:38.044271061 +0000 UTC Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.914761 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:32Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:32 crc 
kubenswrapper[4694]: I0217 16:43:32.932577 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:32Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.952602 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:32Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.958143 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.958235 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.958249 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.958304 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.958324 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:32Z","lastTransitionTime":"2026-02-17T16:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.970286 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:32Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:32 crc kubenswrapper[4694]: I0217 16:43:32.997342 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:32Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.026485 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:26Z\\\",\\\"message\\\":\\\"scaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, 
services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 16:43:26.898079 6800 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 16:43:26.898091 6800 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-pj7v4\\\\nI0217 16:43:26.898767 6800 services_controller.go:452] Built service openshift-machine-api/cluster-autoscaler-operator per-node LB for network=default: []services.LB{}\\\\nF0217 16:43:26.898788 6800 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:43:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.045096 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.065366 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.065422 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.065440 4694 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.065693 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.065717 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.078310 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.101932 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc5236
2a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.116996 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\"
,\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.129173 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.147001 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.163569 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a
30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.168892 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.168967 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.168991 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc 
kubenswrapper[4694]: I0217 16:43:33.169027 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.169050 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.178503 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17
T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.196269 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.216654 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.233718 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.251478 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.264813 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:33Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.271811 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.271837 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.271848 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.271865 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.271877 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.374384 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.374638 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.374646 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.374658 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.374667 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.477206 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.477271 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.477294 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.477322 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.477346 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.580836 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.580898 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.580922 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.580950 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.580973 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.684257 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.684336 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.684359 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.684388 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.684415 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.786245 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.786297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.786307 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.786325 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.786337 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.888401 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.888447 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.888461 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.888478 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.888492 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.906125 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:09:09.564980178 +0000 UTC Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.992808 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.992854 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.992863 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.992879 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:33 crc kubenswrapper[4694]: I0217 16:43:33.992887 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:33Z","lastTransitionTime":"2026-02-17T16:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.095459 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.095542 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.095562 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.095587 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.095636 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.199197 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.199262 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.199278 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.199302 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.199322 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.302158 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.302194 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.302203 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.302217 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.302227 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.404808 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.404862 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.404878 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.404900 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.404928 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.507248 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.507298 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.507316 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.507351 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.507366 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.609815 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.609903 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.609929 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.609959 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.609980 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.712952 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.713010 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.713022 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.713041 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.713054 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.815997 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.816050 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.816067 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.816089 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.816106 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.894818 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.894814 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.895008 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:34 crc kubenswrapper[4694]: E0217 16:43:34.895165 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.895256 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:34 crc kubenswrapper[4694]: E0217 16:43:34.895652 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:34 crc kubenswrapper[4694]: E0217 16:43:34.895826 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:34 crc kubenswrapper[4694]: E0217 16:43:34.896044 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.906450 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 02:28:49.495599785 +0000 UTC Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.918646 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.918685 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.918720 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.918735 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:34 crc kubenswrapper[4694]: I0217 16:43:34.918748 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:34Z","lastTransitionTime":"2026-02-17T16:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.021657 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.021747 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.021773 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.021795 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.021811 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.124786 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.124849 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.124871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.124899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.124922 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.228185 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.228393 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.228425 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.228453 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.228469 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.331487 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.331559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.331585 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.331649 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.331708 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.434375 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.434410 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.434420 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.434432 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.434440 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.536353 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.536379 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.536387 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.536399 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.536407 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.638557 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.638651 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.638671 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.638693 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.638711 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.742166 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.742230 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.742266 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.742297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.742362 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.845433 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.845485 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.845503 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.845528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.845545 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.907443 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 11:09:26.130179486 +0000 UTC Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.947126 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.947173 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.947184 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.947197 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:35 crc kubenswrapper[4694]: I0217 16:43:35.947207 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:35Z","lastTransitionTime":"2026-02-17T16:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.049559 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.049597 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.049621 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.049636 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.049645 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.152456 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.152511 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.152535 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.152563 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.152585 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.255255 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.255307 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.255320 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.255335 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.255344 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.357829 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.357890 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.357907 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.357930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.357947 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.461253 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.461309 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.461326 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.461347 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.461360 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.564028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.564103 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.564118 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.564138 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.564152 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.667224 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.667258 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.667323 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.667357 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.667374 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.687962 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.688066 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.688180 4694 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.688199 4694 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.688284 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.688254081 +0000 UTC m=+148.445329495 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.688310 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.688299523 +0000 UTC m=+148.445374967 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.769956 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.769999 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.770011 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.770026 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.770036 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.788510 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788638 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.788593034 +0000 UTC m=+148.545668358 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.788723 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.788767 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788892 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788912 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788924 4694 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788961 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.788951163 +0000 UTC m=+148.546026497 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788892 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.788998 4694 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.789010 4694 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.789045 4694 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.789034046 +0000 UTC m=+148.546109370 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.872782 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.872826 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.872837 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.872853 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.872864 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.894699 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.894752 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.894728 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.894713 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.894922 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.895218 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.895447 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:36 crc kubenswrapper[4694]: E0217 16:43:36.895693 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.908285 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 16:48:55.239611426 +0000 UTC Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.975801 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.975863 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.975881 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.975904 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:36 crc kubenswrapper[4694]: I0217 16:43:36.975920 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:36Z","lastTransitionTime":"2026-02-17T16:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.078857 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.078912 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.078926 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.078946 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.078957 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.181784 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.181852 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.181868 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.181891 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.181912 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.284313 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.284361 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.284373 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.284391 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.284403 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.389212 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.389296 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.389318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.389348 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.389370 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.493190 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.493233 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.493245 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.493264 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.493278 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.595666 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.595701 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.595710 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.595722 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.595731 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.698183 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.698273 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.698296 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.698332 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.698354 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.800311 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.800474 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.800503 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.800524 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.800540 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.903221 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.903260 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.903271 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.903284 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.903294 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:37Z","lastTransitionTime":"2026-02-17T16:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:37 crc kubenswrapper[4694]: I0217 16:43:37.908589 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 14:07:28.322045339 +0000 UTC Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.006583 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.006690 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.006727 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.006760 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.006782 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.109504 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.109570 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.109588 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.109652 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.109671 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.211845 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.211906 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.211929 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.211959 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.211983 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.314273 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.314326 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.314335 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.314349 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.314358 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.416867 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.416923 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.416940 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.416960 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.416975 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.520093 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.520167 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.520190 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.520221 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.520242 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.623131 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.623200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.623222 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.623250 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.623270 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.726422 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.726494 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.726517 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.726546 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.726569 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.829151 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.829186 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.829198 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.829211 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.829223 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.894973 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.895030 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:38 crc kubenswrapper[4694]: E0217 16:43:38.895206 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.895252 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.895310 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:38 crc kubenswrapper[4694]: E0217 16:43:38.895418 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:38 crc kubenswrapper[4694]: E0217 16:43:38.895699 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:38 crc kubenswrapper[4694]: E0217 16:43:38.895983 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.909254 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:44:41.619227282 +0000 UTC Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.930756 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.930787 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.930796 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.930809 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:38 crc kubenswrapper[4694]: I0217 16:43:38.930818 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:38Z","lastTransitionTime":"2026-02-17T16:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.033353 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.033420 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.033433 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.033451 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.033462 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.137096 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.137159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.137180 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.137220 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.137241 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.239492 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.239536 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.239553 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.239569 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.239580 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.342188 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.342240 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.342255 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.342275 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.342292 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.444102 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.444143 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.444161 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.444173 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.444182 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.546412 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.546443 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.546451 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.546464 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.546473 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.648819 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.648906 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.648917 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.648931 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.648941 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.751299 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.751408 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.751473 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.751519 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.751539 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.854131 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.854167 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.854178 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.854203 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.854216 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.910346 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 14:12:59.40329919 +0000 UTC Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.956396 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.956463 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.956473 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.956490 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:39 crc kubenswrapper[4694]: I0217 16:43:39.956500 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:39Z","lastTransitionTime":"2026-02-17T16:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.060527 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.060621 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.060634 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.060653 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.060666 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.164097 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.164155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.164168 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.164185 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.164198 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.267293 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.267373 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.267392 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.267417 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.267435 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.370499 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.370564 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.370585 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.370643 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.370662 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.473821 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.473880 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.473890 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.473908 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.473925 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.577723 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.577781 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.577792 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.577814 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.577829 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.680317 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.680351 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.680360 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.680374 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.680383 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.784091 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.784155 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.784169 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.784186 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.784198 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.872537 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.872697 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.872729 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.872761 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.872784 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.890078 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.894732 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.894774 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.894852 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.895033 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.895149 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.895279 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.895588 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.895796 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.895930 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.896118 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.896292 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.896371 4694 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.898153 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.898325 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.898364 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.910670 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:52:45.029759674 +0000 UTC Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.919234 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.923423 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.923482 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.923492 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.923514 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.923529 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.940026 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.944506 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.944530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.944538 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.944569 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.944580 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.958064 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.962659 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.962714 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.962734 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.962759 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.962777 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.978535 4694 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3d94249-43cc-4da5-9743-6861e47e40f5\\\",\\\"systemUUID\\\":\\\"2d3fb0f7-c717-4d67-9d61-62b30d044694\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:40Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:40 crc kubenswrapper[4694]: E0217 16:43:40.978788 4694 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.981852 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.981904 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.981920 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.981943 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:40 crc kubenswrapper[4694]: I0217 16:43:40.981958 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:40Z","lastTransitionTime":"2026-02-17T16:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.084888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.084963 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.084986 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.085014 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.085036 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.189650 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.189704 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.189718 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.189755 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.189772 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.293750 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.293810 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.293841 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.293869 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.293891 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.396398 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.396474 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.396496 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.396521 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.396540 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.499389 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.499436 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.499446 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.499463 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.499474 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.602445 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.602483 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.602492 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.602506 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.602516 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.705741 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.705879 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.705913 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.705945 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.705967 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.809323 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.809383 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.809406 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.809439 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.809462 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.910804 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 08:48:50.24361581 +0000 UTC Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.911959 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.912020 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.912032 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.912050 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:41 crc kubenswrapper[4694]: I0217 16:43:41.912061 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:41Z","lastTransitionTime":"2026-02-17T16:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.013854 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.014186 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.014298 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.014380 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.014461 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.117879 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.118197 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.118300 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.118390 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.118469 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.220698 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.220746 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.220757 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.220776 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.220788 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.323892 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.323958 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.323978 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.324000 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.324016 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.426551 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.426660 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.426674 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.426697 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.426712 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.530077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.530147 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.530160 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.530182 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.530197 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.632930 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.632960 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.632969 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.632983 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.632994 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.736173 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.736433 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.736509 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.736628 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.736710 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.839150 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.839185 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.839195 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.839210 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.839222 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.894794 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.894801 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:42 crc kubenswrapper[4694]: E0217 16:43:42.895025 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.894865 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.894863 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:42 crc kubenswrapper[4694]: E0217 16:43:42.895096 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:42 crc kubenswrapper[4694]: E0217 16:43:42.895163 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:42 crc kubenswrapper[4694]: E0217 16:43:42.895240 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.905763 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974057b2-a009-4d99-8bad-e50b651c8c3c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm5nv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4qb4m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc 
kubenswrapper[4694]: I0217 16:43:42.910952 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:20:50.210486878 +0000 UTC Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.915979 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e542f8-47b2-42aa-91cf-bc06f1077abb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a52db77a43e79b89264cf792a7dfde887fd48f5e07315d1972ca8d090b275f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511445abc827b1c8406a47beaae1273199a03803d373c3314590a7f3163d3d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1f24271d0235894c195afc189f75a3e079c5a3307a632c41eee5c43c29fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be4e49d9b7083f1d9f43c54399e7aa6bb9685ba9c4b730b0a2d327796fc1878d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.928072 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b44794eb68f32a46ab8733798b5e3563de9877f87a5908efb1efebc5314c84d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.939015 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5rjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3df13ae5-41cc-4e30-9a22-b3cde3ceeb5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73e5bb132f36c8acc1817f8b5f458ca556a56063753f4d415ca6f6b55b59292\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-btbkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5rjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.940985 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.941006 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.941014 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.941028 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.941036 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:42Z","lastTransitionTime":"2026-02-17T16:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.954442 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d42qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3af5c84a-80ed-47ac-a79d-25b46c8e956e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6ac77797b6d540fc1dd21203b6b3cb6f4c20d25961bb8ef6c7d1d67636225ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcc1e67ed84e55b21d73b1aa59003b9ea2fa0a6615b40e4084802a55bb480fcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://24e97cd1997717a894ad0e762ed1bd879fd1f1cc035859c771f0e4dd3d2383b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae954cc3ca4b8bb783fbca6313841eeddb9fb442a71cf55993abfd3829a88315\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bdc26d171301d47ba7596a0f4a42af83eedd031923a069b6cd2c48a3134ae5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://247312b1d20fae027ce34bf84ab2fc0390a39842e95728cd199679e15ce45b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcbc88a745a51173752e933f4e14c30b740351f8f34a5a77bc45fece4e8c047e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k98nx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d42qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.974053 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:26Z\\\",\\\"message\\\":\\\"scaler-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, 
services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.245\\\\\\\", Port:9192, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 16:43:26.898079 6800 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 16:43:26.898091 6800 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-pj7v4\\\\nI0217 16:43:26.898767 6800 services_controller.go:452] Built service openshift-machine-api/cluster-autoscaler-operator per-node LB for network=default: []services.LB{}\\\\nF0217 16:43:26.898788 6800 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network contr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:43:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f4c082125c1e796db
cd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hpm9k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8fjpm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.986535 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:42 crc kubenswrapper[4694]: I0217 16:43:42.995948 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6899ac0e-50b3-49cf-99f3-4717d5698029\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53ce91772a2a89f36558f74b3d2ba9377ff2437c3071ca983700df7d41a884f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51e83cb31906ddae327fc3d887d5b5c3e8ad99e6d686ac6da7e3ad5a03571fc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:42Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.019906 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"531860f4-07a9-488d-964f-55c4f459307b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec42134d94056c06a49b087abbd483f4095e5d1ab6b18a635b5faff9064f7ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6be3e93794a448c75f042f24234b3e54dc5f14502e44cfd49e9648ad37adf63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0ca20b8014b1db6f12f9d19525c64d2efaafe72930e27b9b7496c7a2a67e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3cad3f7615b0e316ff53d1eb217eeb121f312972565cb6b78359060e3f509319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33d5d1424c3cf83ee26237d573469d886b72cc802af62fd7997363d35326e1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c1363d3baf96f9f4da994a28dd1db9a5c0ad7d4d60a0dc483fd3eaec917c338\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bfcb3a74c0e8ea3be62c34f312ca0d8634c4ef495aae056b5c6907646202b05\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4562508d3cc52362a636a996bab889fe16fc1cf63b086cd151c2fb073d503868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.037660 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e152d0e3-8cc4-49c4-adeb-fa8710dbcf34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"message\\\":\\\"ed a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\"\\\\nI0217 16:42:33.273638 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 16:42:33.276781 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 16:42:33.276812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 16:42:33.276836 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 16:42:33.276841 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 16:42:33.288201 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 16:42:33.288224 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288229 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 16:42:33.288234 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0217 16:42:33.288237 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 16:42:33.288239 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 16:42:33.288242 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 16:42:33.288329 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0217 16:42:33.293537 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2693711308/tls.crt::/tmp/serving-cert-2693711308/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771346547\\\\\\\\\\\\\\\" (2026-02-17 16:42:26 +0000 UTC to 2026-03-19 16:42:27 +0000 UTC (now=2026-02-17 16:42:33.293512446 +0000 UTC))\\\\\\\"\\\\nF0217 16:42:33.293541 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.043639 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.043712 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.043729 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.043854 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.043872 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.050351 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-r6gvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8178c0dd-5081-42ba-ae9d-3384017e0cb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60df5f4994dea66b7e723b30bd800575c2b4664db8f4ba82246b0049b8ea5e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dfc4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-r6gvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.068487 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pj7v4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428dd081-b1bb-404f-856a-f33a1fa7c24a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:43:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8b78df13f727684a09b53f0d87b17249314132451
04f7b26480a61df075b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T16:43:21Z\\\",\\\"message\\\":\\\"2026-02-17T16:42:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630\\\\n2026-02-17T16:42:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d89bed3-0011-4fdd-9f1a-c953c7389630 to /host/opt/cni/bin/\\\\n2026-02-17T16:42:36Z [verbose] multus-daemon started\\\\n2026-02-17T16:42:36Z [verbose] Readiness Indicator file check\\\\n2026-02-17T16:43:21Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:43:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5qqgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pj7v4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.083791 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05e7e385-beb4-4e06-8718-fd68e90ba74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4c1f77697b39a4723b1878058b93642689bc6e1f87c2a562ba6252ecfe186e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2mdrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b5hgc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.097020 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d99a34c-188b-4df8-9d10-48ca322b8d9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66a8e8425443288e3f5c88e4c086d60699397a62fb820401b6d3b0936eb2ac24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55c317ec850e28a1dab464354ba6f54e4dd513991096fcdd299659b7b1d3d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zr7ht\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g8vpr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.111439 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14fe5944-48c7-4d87-9de7-9598f2ddec8c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baf33b9c18b80d91ae9cfb76cedb99f7b4a9cfaa921c719bd86e29b56b5c3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4e6076000dfbe9b9f410152f1176361a939df5aa366d1ac550045182673a1b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3465e9d98221b3eeb82bdf00c55a3f7b36d1051ae7b60913aff457312ae98494\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T16:42:12Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.123995 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61ab8d027d03eec4c49539a5d361db58631252a66dc361ce0c25241f0d5534f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.138392 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.146023 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.146063 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.146074 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.146088 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.146099 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.152109 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.164987 4694 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T16:42:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc54d8c7c33c9db85d9b4ad0972e907c6ccc3724377bc9b533c4e0293f994a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc24db3b47cfdafc54e7727409fe0920cc98306d54db3efbc27282d16b7fe83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T16:42:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T16:43:43Z is after 2025-08-24T17:21:41Z" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.248166 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.248200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.248210 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.248225 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.248235 4694 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.351528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.351592 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.351674 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.351705 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.351727 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.454215 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.454266 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.454279 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.454297 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.454311 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.556450 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.556506 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.556521 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.556542 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.556556 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.659570 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.659655 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.659670 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.659691 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.659707 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.762182 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.762228 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.762240 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.762282 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.762291 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.864802 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.864874 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.864889 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.864907 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.864919 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.911680 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 04:36:07.299774666 +0000 UTC Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.967489 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.967547 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.967558 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.967577 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:43 crc kubenswrapper[4694]: I0217 16:43:43.967589 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:43Z","lastTransitionTime":"2026-02-17T16:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.069836 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.069880 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.069892 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.069909 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.069919 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.173084 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.173461 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.173478 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.173533 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.173548 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.276834 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.276872 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.276883 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.276899 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.276911 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.379936 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.379993 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.380012 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.380034 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.380052 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.482743 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.482776 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.482816 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.482833 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.482843 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.585644 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.585693 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.585706 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.585723 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.585734 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.688260 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.688305 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.688315 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.688329 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.688339 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.791268 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.791357 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.791369 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.791392 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.791405 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.893821 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.893871 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.893888 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.893905 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.893915 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.894637 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.894736 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.895013 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:44 crc kubenswrapper[4694]: E0217 16:43:44.894948 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:44 crc kubenswrapper[4694]: E0217 16:43:44.895087 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:44 crc kubenswrapper[4694]: E0217 16:43:44.895141 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.895406 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:44 crc kubenswrapper[4694]: E0217 16:43:44.895564 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.912329 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 14:14:18.356434119 +0000 UTC Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.995989 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.996059 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.996088 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.996109 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:44 crc kubenswrapper[4694]: I0217 16:43:44.996127 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:44Z","lastTransitionTime":"2026-02-17T16:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.099251 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.099326 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.099351 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.099381 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.099406 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.201422 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.201469 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.201512 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.201530 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.201541 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.304052 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.304112 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.304129 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.304157 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.304173 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.407226 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.407306 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.407324 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.407346 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.407363 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.510567 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.510642 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.510653 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.510668 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.510677 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.613346 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.613389 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.613447 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.613472 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.613503 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.716014 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.716125 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.716223 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.716251 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.716268 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.819081 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.819115 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.819123 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.819136 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.819145 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.913086 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 23:06:14.596267988 +0000 UTC Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.922294 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.922400 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.922424 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.922452 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:45 crc kubenswrapper[4694]: I0217 16:43:45.922478 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:45Z","lastTransitionTime":"2026-02-17T16:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.024724 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.024778 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.024789 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.024805 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.024816 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.127753 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.127807 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.127823 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.127846 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.127862 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.231200 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.231290 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.231326 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.231358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.231380 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.334650 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.334724 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.334748 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.334778 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.334799 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.437042 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.437084 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.437095 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.437117 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.437129 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.539735 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.539774 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.539785 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.539820 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.539833 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.642069 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.642102 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.642111 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.642125 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.642133 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.745703 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.745750 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.745761 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.745776 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.745786 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.847742 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.847847 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.847887 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.847906 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.847920 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.895338 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.895377 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.895385 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:46 crc kubenswrapper[4694]: E0217 16:43:46.895448 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:46 crc kubenswrapper[4694]: E0217 16:43:46.895599 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:46 crc kubenswrapper[4694]: E0217 16:43:46.895879 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.896157 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:46 crc kubenswrapper[4694]: E0217 16:43:46.896230 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.914099 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:02:18.352106091 +0000 UTC Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.949827 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.949892 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.949911 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.949946 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:46 crc kubenswrapper[4694]: I0217 16:43:46.949963 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:46Z","lastTransitionTime":"2026-02-17T16:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.056282 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.056331 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.056356 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.056380 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.056392 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.158741 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.158766 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.158773 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.158787 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.158805 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.261047 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.261076 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.261099 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.261114 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.261122 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.364971 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.365052 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.365094 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.365112 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.365126 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.469311 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.469401 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.469425 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.469917 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.470164 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.573358 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.573404 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.573416 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.573434 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.573447 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.676591 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.676687 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.676701 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.676717 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.676728 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.779354 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.779431 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.779448 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.779468 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.779484 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.882543 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.882601 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.882649 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.882677 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.882695 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.914550 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:25:44.732502825 +0000 UTC Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.986064 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.986401 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.986498 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.986651 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:47 crc kubenswrapper[4694]: I0217 16:43:47.986760 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:47Z","lastTransitionTime":"2026-02-17T16:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.089741 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.090098 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.090207 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.090308 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.090417 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.192960 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.193014 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.193033 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.193055 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.193070 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.295177 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.295210 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.295220 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.295238 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.295249 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.398887 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.398948 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.398966 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.398997 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.399018 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.501545 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.501627 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.501639 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.501656 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.501667 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.604089 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.604728 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.605043 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.605147 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.605237 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.707634 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.707677 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.707687 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.707721 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.707733 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.810276 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.810317 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.810325 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.810338 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.810351 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.895289 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.895325 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.895358 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:48 crc kubenswrapper[4694]: E0217 16:43:48.895443 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.895293 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:48 crc kubenswrapper[4694]: E0217 16:43:48.895589 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:48 crc kubenswrapper[4694]: E0217 16:43:48.895724 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:48 crc kubenswrapper[4694]: E0217 16:43:48.895810 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.913237 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.913268 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.913279 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.913295 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.913306 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:48Z","lastTransitionTime":"2026-02-17T16:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:48 crc kubenswrapper[4694]: I0217 16:43:48.914933 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 09:48:37.991138695 +0000 UTC Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.016072 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.016146 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.016159 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.016181 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.016196 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.119580 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.120077 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.120238 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.120411 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.120562 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.223371 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.223434 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.223446 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.223462 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.223496 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.326471 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.326520 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.326540 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.326561 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.326577 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.429278 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.429324 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.429340 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.429363 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.429380 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.531943 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.531993 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.532004 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.532022 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.532032 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.634332 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.634385 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.634396 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.634413 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.634423 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.736872 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.737032 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.737061 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.737151 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.737265 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.839931 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.839981 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.839992 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.840009 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.840026 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.915960 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 13:36:40.467014224 +0000 UTC Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.941861 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.941905 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.941916 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.941932 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:49 crc kubenswrapper[4694]: I0217 16:43:49.941942 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:49Z","lastTransitionTime":"2026-02-17T16:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.044754 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.044804 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.044819 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.044835 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.044847 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.147814 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.147851 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.147861 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.147875 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.147886 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.250044 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.250113 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.250131 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.250156 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.250174 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.353066 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.353123 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.353141 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.353163 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.353180 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.455392 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.455437 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.455452 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.455471 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.455485 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.557764 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.557844 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.557855 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.557872 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.557883 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.660517 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.660563 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.660572 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.660586 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.660595 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.763330 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.763373 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.763385 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.763399 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.763409 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.865840 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.865902 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.865919 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.865941 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.865958 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.895338 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.895446 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.895771 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:50 crc kubenswrapper[4694]: E0217 16:43:50.895872 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.895927 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:50 crc kubenswrapper[4694]: E0217 16:43:50.896056 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:50 crc kubenswrapper[4694]: E0217 16:43:50.896350 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:50 crc kubenswrapper[4694]: E0217 16:43:50.896835 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.916869 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 04:07:33.262984502 +0000 UTC Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.967630 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.967700 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.967709 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.967724 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:50 crc kubenswrapper[4694]: I0217 16:43:50.967732 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:50Z","lastTransitionTime":"2026-02-17T16:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.070273 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.070318 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.070329 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.070346 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.070358 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:51Z","lastTransitionTime":"2026-02-17T16:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.136357 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:51 crc kubenswrapper[4694]: E0217 16:43:51.136497 4694 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:43:51 crc kubenswrapper[4694]: E0217 16:43:51.136577 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs podName:974057b2-a009-4d99-8bad-e50b651c8c3c nodeName:}" failed. No retries permitted until 2026-02-17 16:44:55.136554042 +0000 UTC m=+162.893629366 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs") pod "network-metrics-daemon-4qb4m" (UID: "974057b2-a009-4d99-8bad-e50b651c8c3c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.173721 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.173763 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.173773 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.173789 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.173801 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:51Z","lastTransitionTime":"2026-02-17T16:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.232528 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.232573 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.232582 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.232597 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.232629 4694 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T16:43:51Z","lastTransitionTime":"2026-02-17T16:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.277914 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w"] Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.278417 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.281761 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.282031 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.282178 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.282676 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.312354 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podStartSLOduration=79.312331349 podStartE2EDuration="1m19.312331349s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.297971307 +0000 UTC m=+99.055046621" watchObservedRunningTime="2026-02-17 16:43:51.312331349 +0000 UTC m=+99.069406673" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.327768 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g8vpr" podStartSLOduration=78.327747828 podStartE2EDuration="1m18.327747828s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.312525764 +0000 UTC m=+99.069601098" watchObservedRunningTime="2026-02-17 
16:43:51.327747828 +0000 UTC m=+99.084823152" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.327960 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.327955884 podStartE2EDuration="1m19.327955884s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.327912322 +0000 UTC m=+99.084987666" watchObservedRunningTime="2026-02-17 16:43:51.327955884 +0000 UTC m=+99.085031208" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.337940 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.337999 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.338022 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 
crc kubenswrapper[4694]: I0217 16:43:51.338056 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.338085 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.401657 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pj7v4" podStartSLOduration=79.401636674 podStartE2EDuration="1m19.401636674s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.401385407 +0000 UTC m=+99.158460741" watchObservedRunningTime="2026-02-17 16:43:51.401636674 +0000 UTC m=+99.158711998" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.416994 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.416970171 podStartE2EDuration="46.416970171s" podCreationTimestamp="2026-02-17 16:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.416889389 +0000 UTC m=+99.173964713" watchObservedRunningTime="2026-02-17 
16:43:51.416970171 +0000 UTC m=+99.174045495" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.437847 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5rjgs" podStartSLOduration=79.437827217 podStartE2EDuration="1m19.437827217s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.43753483 +0000 UTC m=+99.194610154" watchObservedRunningTime="2026-02-17 16:43:51.437827217 +0000 UTC m=+99.194902531" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.438862 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.438906 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.438928 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.438968 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.438974 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.439003 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.439066 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.439881 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.447377 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.455097 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03cc36dd-e297-49ab-adb6-4dc8e0cab1d0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rn59w\" (UID: \"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.458119 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d42qm" podStartSLOduration=79.458105439 podStartE2EDuration="1m19.458105439s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.45773901 +0000 UTC m=+99.214814334" watchObservedRunningTime="2026-02-17 16:43:51.458105439 +0000 UTC m=+99.215180763" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.518260 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.518244497 podStartE2EDuration="31.518244497s" podCreationTimestamp="2026-02-17 16:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.517888908 +0000 UTC 
m=+99.274964252" watchObservedRunningTime="2026-02-17 16:43:51.518244497 +0000 UTC m=+99.275319821" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.539929 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.539912124 podStartE2EDuration="1m18.539912124s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.539138865 +0000 UTC m=+99.296214189" watchObservedRunningTime="2026-02-17 16:43:51.539912124 +0000 UTC m=+99.296987448" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.552326 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.552309697 podStartE2EDuration="1m17.552309697s" podCreationTimestamp="2026-02-17 16:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.551680561 +0000 UTC m=+99.308755885" watchObservedRunningTime="2026-02-17 16:43:51.552309697 +0000 UTC m=+99.309385021" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.562263 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-r6gvx" podStartSLOduration=79.562246018 podStartE2EDuration="1m19.562246018s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:51.56194791 +0000 UTC m=+99.319023234" watchObservedRunningTime="2026-02-17 16:43:51.562246018 +0000 UTC m=+99.319321342" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.591980 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.917802 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:09:49.371531355 +0000 UTC Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.918219 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 16:43:51 crc kubenswrapper[4694]: I0217 16:43:51.925456 4694 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.449329 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" event={"ID":"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0","Type":"ContainerStarted","Data":"325ab42d87e0d97c489930465e30db6c91be6a0bf6a22b7216ab1d3893ae703f"} Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.449738 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" event={"ID":"03cc36dd-e297-49ab-adb6-4dc8e0cab1d0","Type":"ContainerStarted","Data":"e3222b4f3ec8185ecb055dc1496d428cadb62b002d972417e659a2bb0bf6817d"} Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.465985 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rn59w" podStartSLOduration=80.465955121 podStartE2EDuration="1m20.465955121s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:43:52.465209033 +0000 UTC m=+100.222284367" watchObservedRunningTime="2026-02-17 16:43:52.465955121 +0000 UTC m=+100.223030485" Feb 17 16:43:52 crc 
kubenswrapper[4694]: I0217 16:43:52.895329 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:52 crc kubenswrapper[4694]: E0217 16:43:52.896686 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.896713 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.896747 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.896767 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:52 crc kubenswrapper[4694]: E0217 16:43:52.896803 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:52 crc kubenswrapper[4694]: E0217 16:43:52.896885 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:52 crc kubenswrapper[4694]: E0217 16:43:52.896996 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:52 crc kubenswrapper[4694]: I0217 16:43:52.897853 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:43:52 crc kubenswrapper[4694]: E0217 16:43:52.898972 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:43:54 crc kubenswrapper[4694]: I0217 16:43:54.894519 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:54 crc kubenswrapper[4694]: E0217 16:43:54.894664 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:54 crc kubenswrapper[4694]: I0217 16:43:54.894698 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:54 crc kubenswrapper[4694]: I0217 16:43:54.894733 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:54 crc kubenswrapper[4694]: I0217 16:43:54.894755 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:54 crc kubenswrapper[4694]: E0217 16:43:54.894966 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:54 crc kubenswrapper[4694]: E0217 16:43:54.895084 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:54 crc kubenswrapper[4694]: E0217 16:43:54.895155 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:56 crc kubenswrapper[4694]: I0217 16:43:56.894678 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:56 crc kubenswrapper[4694]: I0217 16:43:56.894736 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:56 crc kubenswrapper[4694]: I0217 16:43:56.894736 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:56 crc kubenswrapper[4694]: E0217 16:43:56.894845 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:56 crc kubenswrapper[4694]: I0217 16:43:56.894876 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:56 crc kubenswrapper[4694]: E0217 16:43:56.895094 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:43:56 crc kubenswrapper[4694]: E0217 16:43:56.895135 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:56 crc kubenswrapper[4694]: E0217 16:43:56.895212 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:58 crc kubenswrapper[4694]: I0217 16:43:58.895424 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:43:58 crc kubenswrapper[4694]: E0217 16:43:58.895561 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:43:58 crc kubenswrapper[4694]: I0217 16:43:58.895640 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:43:58 crc kubenswrapper[4694]: I0217 16:43:58.895719 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:43:58 crc kubenswrapper[4694]: I0217 16:43:58.895657 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:43:58 crc kubenswrapper[4694]: E0217 16:43:58.896138 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:43:58 crc kubenswrapper[4694]: E0217 16:43:58.896174 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:43:58 crc kubenswrapper[4694]: E0217 16:43:58.896216 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:00 crc kubenswrapper[4694]: I0217 16:44:00.894671 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:00 crc kubenswrapper[4694]: E0217 16:44:00.894843 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:00 crc kubenswrapper[4694]: I0217 16:44:00.895495 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:00 crc kubenswrapper[4694]: E0217 16:44:00.895659 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:00 crc kubenswrapper[4694]: I0217 16:44:00.895855 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:00 crc kubenswrapper[4694]: E0217 16:44:00.895938 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:00 crc kubenswrapper[4694]: I0217 16:44:00.895962 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:00 crc kubenswrapper[4694]: E0217 16:44:00.896024 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:02 crc kubenswrapper[4694]: I0217 16:44:02.895362 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:02 crc kubenswrapper[4694]: I0217 16:44:02.895430 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:02 crc kubenswrapper[4694]: I0217 16:44:02.895503 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:02 crc kubenswrapper[4694]: E0217 16:44:02.896846 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:02 crc kubenswrapper[4694]: I0217 16:44:02.896897 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:02 crc kubenswrapper[4694]: E0217 16:44:02.897011 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:02 crc kubenswrapper[4694]: E0217 16:44:02.897224 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:02 crc kubenswrapper[4694]: E0217 16:44:02.897270 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:03 crc kubenswrapper[4694]: I0217 16:44:03.895079 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:44:03 crc kubenswrapper[4694]: E0217 16:44:03.895314 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8fjpm_openshift-ovn-kubernetes(d15f1d18-d80a-4fc0-a710-a95c74465b6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" Feb 17 16:44:04 crc kubenswrapper[4694]: I0217 16:44:04.894697 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:04 crc kubenswrapper[4694]: I0217 16:44:04.894797 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:04 crc kubenswrapper[4694]: I0217 16:44:04.894873 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:04 crc kubenswrapper[4694]: E0217 16:44:04.894860 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:04 crc kubenswrapper[4694]: E0217 16:44:04.895036 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:04 crc kubenswrapper[4694]: E0217 16:44:04.895185 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:04 crc kubenswrapper[4694]: I0217 16:44:04.895866 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:04 crc kubenswrapper[4694]: E0217 16:44:04.896019 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:06 crc kubenswrapper[4694]: I0217 16:44:06.895534 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:06 crc kubenswrapper[4694]: I0217 16:44:06.895553 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:06 crc kubenswrapper[4694]: I0217 16:44:06.895664 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:06 crc kubenswrapper[4694]: I0217 16:44:06.895702 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:06 crc kubenswrapper[4694]: E0217 16:44:06.895878 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:06 crc kubenswrapper[4694]: E0217 16:44:06.895925 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:06 crc kubenswrapper[4694]: E0217 16:44:06.896090 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:06 crc kubenswrapper[4694]: E0217 16:44:06.896750 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.500773 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/1.log" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.501306 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/0.log" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.501349 4694 generic.go:334] "Generic (PLEG): container finished" podID="428dd081-b1bb-404f-856a-f33a1fa7c24a" containerID="2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09" exitCode=1 Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.501384 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerDied","Data":"2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09"} Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.501423 4694 scope.go:117] "RemoveContainer" containerID="3e809248e9e64c68a1ec3d8386898c70bd406109e0692d3dbcafc33a5434286d" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.501917 4694 scope.go:117] "RemoveContainer" containerID="2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09" Feb 17 16:44:08 crc kubenswrapper[4694]: E0217 16:44:08.502188 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pj7v4_openshift-multus(428dd081-b1bb-404f-856a-f33a1fa7c24a)\"" pod="openshift-multus/multus-pj7v4" podUID="428dd081-b1bb-404f-856a-f33a1fa7c24a" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.894956 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.894984 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.894956 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:08 crc kubenswrapper[4694]: E0217 16:44:08.895079 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:08 crc kubenswrapper[4694]: E0217 16:44:08.895175 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:08 crc kubenswrapper[4694]: I0217 16:44:08.895212 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:08 crc kubenswrapper[4694]: E0217 16:44:08.895341 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:08 crc kubenswrapper[4694]: E0217 16:44:08.895421 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:09 crc kubenswrapper[4694]: I0217 16:44:09.508819 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/1.log" Feb 17 16:44:10 crc kubenswrapper[4694]: I0217 16:44:10.895410 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:10 crc kubenswrapper[4694]: I0217 16:44:10.895433 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:10 crc kubenswrapper[4694]: I0217 16:44:10.896444 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:10 crc kubenswrapper[4694]: I0217 16:44:10.896680 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:10 crc kubenswrapper[4694]: E0217 16:44:10.896583 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:10 crc kubenswrapper[4694]: E0217 16:44:10.897093 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:10 crc kubenswrapper[4694]: E0217 16:44:10.897186 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:10 crc kubenswrapper[4694]: E0217 16:44:10.897269 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:12 crc kubenswrapper[4694]: I0217 16:44:12.895205 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:12 crc kubenswrapper[4694]: I0217 16:44:12.895232 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:12 crc kubenswrapper[4694]: I0217 16:44:12.895933 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:12 crc kubenswrapper[4694]: E0217 16:44:12.896419 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:12 crc kubenswrapper[4694]: I0217 16:44:12.896448 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:12 crc kubenswrapper[4694]: E0217 16:44:12.896571 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:12 crc kubenswrapper[4694]: E0217 16:44:12.896755 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:12 crc kubenswrapper[4694]: E0217 16:44:12.896835 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:12 crc kubenswrapper[4694]: E0217 16:44:12.902863 4694 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 16:44:13 crc kubenswrapper[4694]: E0217 16:44:13.070186 4694 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 16:44:14 crc kubenswrapper[4694]: I0217 16:44:14.895172 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:14 crc kubenswrapper[4694]: I0217 16:44:14.895234 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:14 crc kubenswrapper[4694]: I0217 16:44:14.895283 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:14 crc kubenswrapper[4694]: E0217 16:44:14.895329 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:14 crc kubenswrapper[4694]: E0217 16:44:14.895403 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:14 crc kubenswrapper[4694]: E0217 16:44:14.895471 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:14 crc kubenswrapper[4694]: I0217 16:44:14.895685 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:14 crc kubenswrapper[4694]: E0217 16:44:14.895804 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:16 crc kubenswrapper[4694]: I0217 16:44:16.895317 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:16 crc kubenswrapper[4694]: E0217 16:44:16.895487 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:16 crc kubenswrapper[4694]: I0217 16:44:16.895311 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:16 crc kubenswrapper[4694]: E0217 16:44:16.895648 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:16 crc kubenswrapper[4694]: I0217 16:44:16.896092 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:16 crc kubenswrapper[4694]: E0217 16:44:16.896163 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:16 crc kubenswrapper[4694]: I0217 16:44:16.896436 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:16 crc kubenswrapper[4694]: E0217 16:44:16.896580 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:18 crc kubenswrapper[4694]: E0217 16:44:18.071846 4694 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 16:44:18 crc kubenswrapper[4694]: I0217 16:44:18.894863 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:18 crc kubenswrapper[4694]: E0217 16:44:18.895099 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:18 crc kubenswrapper[4694]: I0217 16:44:18.895756 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:18 crc kubenswrapper[4694]: I0217 16:44:18.895853 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:18 crc kubenswrapper[4694]: I0217 16:44:18.895857 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:18 crc kubenswrapper[4694]: E0217 16:44:18.896014 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:18 crc kubenswrapper[4694]: E0217 16:44:18.896404 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:18 crc kubenswrapper[4694]: I0217 16:44:18.896550 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:44:18 crc kubenswrapper[4694]: E0217 16:44:18.896785 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:19 crc kubenswrapper[4694]: I0217 16:44:19.541305 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/3.log" Feb 17 16:44:19 crc kubenswrapper[4694]: I0217 16:44:19.543520 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerStarted","Data":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} Feb 17 16:44:19 crc kubenswrapper[4694]: I0217 16:44:19.544001 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:44:19 crc kubenswrapper[4694]: I0217 16:44:19.567091 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podStartSLOduration=107.567072336 podStartE2EDuration="1m47.567072336s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:19.56603766 +0000 UTC m=+127.323113004" watchObservedRunningTime="2026-02-17 16:44:19.567072336 +0000 UTC m=+127.324147670" Feb 17 16:44:19 crc kubenswrapper[4694]: I0217 16:44:19.714461 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4qb4m"] Feb 17 16:44:19 crc kubenswrapper[4694]: I0217 16:44:19.714548 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:19 crc kubenswrapper[4694]: E0217 16:44:19.714643 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:20 crc kubenswrapper[4694]: I0217 16:44:20.895370 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:20 crc kubenswrapper[4694]: I0217 16:44:20.895461 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:20 crc kubenswrapper[4694]: I0217 16:44:20.895508 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:20 crc kubenswrapper[4694]: E0217 16:44:20.895858 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:20 crc kubenswrapper[4694]: E0217 16:44:20.896061 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:20 crc kubenswrapper[4694]: E0217 16:44:20.896163 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:21 crc kubenswrapper[4694]: I0217 16:44:21.894420 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:21 crc kubenswrapper[4694]: E0217 16:44:21.894582 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:21 crc kubenswrapper[4694]: I0217 16:44:21.894780 4694 scope.go:117] "RemoveContainer" containerID="2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09" Feb 17 16:44:22 crc kubenswrapper[4694]: I0217 16:44:22.554601 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/1.log" Feb 17 16:44:22 crc kubenswrapper[4694]: I0217 16:44:22.554677 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerStarted","Data":"4def840a9ce1c58602b78dea39755e808372c982a588c0057faf28647396f7e5"} Feb 17 16:44:22 crc kubenswrapper[4694]: I0217 16:44:22.894844 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:22 crc kubenswrapper[4694]: I0217 16:44:22.894843 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:22 crc kubenswrapper[4694]: E0217 16:44:22.895943 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:22 crc kubenswrapper[4694]: I0217 16:44:22.896025 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:22 crc kubenswrapper[4694]: E0217 16:44:22.896152 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:22 crc kubenswrapper[4694]: E0217 16:44:22.896275 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:23 crc kubenswrapper[4694]: E0217 16:44:23.074971 4694 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 16:44:23 crc kubenswrapper[4694]: I0217 16:44:23.894991 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:23 crc kubenswrapper[4694]: E0217 16:44:23.895204 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:24 crc kubenswrapper[4694]: I0217 16:44:24.894552 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:24 crc kubenswrapper[4694]: I0217 16:44:24.894746 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:24 crc kubenswrapper[4694]: I0217 16:44:24.894781 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:24 crc kubenswrapper[4694]: E0217 16:44:24.895199 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:24 crc kubenswrapper[4694]: E0217 16:44:24.895306 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:24 crc kubenswrapper[4694]: E0217 16:44:24.895412 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:25 crc kubenswrapper[4694]: I0217 16:44:25.398576 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:44:25 crc kubenswrapper[4694]: I0217 16:44:25.895021 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:25 crc kubenswrapper[4694]: E0217 16:44:25.895210 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:26 crc kubenswrapper[4694]: I0217 16:44:26.895462 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:26 crc kubenswrapper[4694]: I0217 16:44:26.895531 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:26 crc kubenswrapper[4694]: E0217 16:44:26.895685 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 16:44:26 crc kubenswrapper[4694]: I0217 16:44:26.895739 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:26 crc kubenswrapper[4694]: E0217 16:44:26.895848 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 16:44:26 crc kubenswrapper[4694]: E0217 16:44:26.895975 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 16:44:27 crc kubenswrapper[4694]: I0217 16:44:27.894853 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:27 crc kubenswrapper[4694]: E0217 16:44:27.895074 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qb4m" podUID="974057b2-a009-4d99-8bad-e50b651c8c3c" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.895202 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.895244 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.895522 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.897873 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.897905 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.898472 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 16:44:28 crc kubenswrapper[4694]: I0217 16:44:28.906059 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 16:44:29 crc kubenswrapper[4694]: I0217 16:44:29.894508 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:29 crc kubenswrapper[4694]: I0217 16:44:29.896154 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 16:44:29 crc kubenswrapper[4694]: I0217 16:44:29.897189 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.329771 4694 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.377923 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wqqd4"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.378602 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.380502 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.381114 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.381841 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.382357 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.383219 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.384085 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.384166 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.384584 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.385529 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t44rx"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.385970 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.386718 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.387381 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.387636 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 16:44:32 crc kubenswrapper[4694]: W0217 16:44:32.387743 4694 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv": failed to list *v1.Secret: secrets "openshift-apiserver-operator-dockercfg-xtcjv" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Feb 17 16:44:32 crc kubenswrapper[4694]: E0217 16:44:32.387763 4694 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-xtcjv\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-operator-dockercfg-xtcjv\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.387782 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vgfk2"] Feb 17 16:44:32 crc kubenswrapper[4694]: W0217 16:44:32.387797 4694 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config": failed to list *v1.ConfigMap: configmaps 
"openshift-apiserver-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Feb 17 16:44:32 crc kubenswrapper[4694]: E0217 16:44:32.387807 4694 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-apiserver-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.388233 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: W0217 16:44:32.400303 4694 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 17 16:44:32 crc kubenswrapper[4694]: E0217 16:44:32.400344 4694 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.400422 
4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.400661 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.400816 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.400968 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.401299 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 16:44:32 crc kubenswrapper[4694]: W0217 16:44:32.401436 4694 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 17 16:44:32 crc kubenswrapper[4694]: E0217 16:44:32.401453 4694 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.402110 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bjgql"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.402102 4694 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.405988 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.406094 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.406720 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.406985 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.404004 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.404104 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.404360 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.404511 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.404941 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.405015 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.405497 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.405826 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.427753 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.428110 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.428319 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.428446 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.429375 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.429566 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.430197 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.430881 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.430965 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.431091 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.431271 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.431409 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.432789 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.432871 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.433083 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.433111 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.433669 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.433874 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.435431 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.439359 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.439535 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.440025 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.440297 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.440696 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.441025 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.441288 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.442891 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nsrtk"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.443357 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.443851 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.444139 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.445412 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.445469 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.445649 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.445763 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.445875 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.445985 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.446093 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.446239 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.446352 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.446462 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.446568 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.448088 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.448198 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.448350 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.448459 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wqqd4"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.449972 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 
16:44:32.450385 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.450493 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.451352 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-896vh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.451803 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.451900 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.452307 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.452397 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-96t75"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.452599 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.452814 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.452874 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.452964 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.453102 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.453317 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.453845 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.453920 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.453962 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.454215 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.454219 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.454668 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t44rx"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.455141 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wq4dh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.455483 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.455739 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.456483 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmljb"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.456829 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.457761 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.458085 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.470808 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vgfk2"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.470846 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.471429 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.488768 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.489105 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.489163 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.489302 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.489372 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.489414 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.489801 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.490114 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.490291 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.490697 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.493781 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.494139 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.494971 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495081 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495102 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495176 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495259 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495366 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495429 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495492 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495551 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495835 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.495954 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496079 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496199 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 
16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496216 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496321 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496418 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496799 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.496904 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.497842 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.498311 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6d7zh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.498937 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499721 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499741 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbd476af-578e-47ea-bfae-4b4d1303106c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499760 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-config\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499777 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3ed4fee-4f78-4018-9a59-d8a98da1659f-audit-dir\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499791 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-config\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499806 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499820 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-encryption-config\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499833 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3ed4fee-4f78-4018-9a59-d8a98da1659f-node-pullsecrets\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499849 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/67ab93c3-9ab4-409c-b349-9032ff88e45b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 
crc kubenswrapper[4694]: I0217 16:44:32.499866 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67cd89f-5768-44e0-9d5c-76e27ab585b7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499880 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-service-ca-bundle\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499893 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f479b7-8c1f-4fd0-a106-c912c664579a-metrics-tls\") pod \"dns-operator-744455d44c-wq4dh\" (UID: \"17f479b7-8c1f-4fd0-a106-c912c664579a\") " pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499907 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-trusted-ca\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499923 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e66f95bf-fb5b-4682-b8b1-910d78519ba4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499940 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499956 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwck8\" (UniqueName: \"kubernetes.io/projected/67ab93c3-9ab4-409c-b349-9032ff88e45b-kube-api-access-gwck8\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499970 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l4wps\" (UID: \"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.499986 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs8fw\" (UniqueName: \"kubernetes.io/projected/63670946-1f60-490f-b79b-d4bacbc46803-kube-api-access-zs8fw\") pod 
\"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500012 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cbe9f19-4c05-4266-b4a8-53af41586325-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500026 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-client-ca\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500041 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cbe9f19-4c05-4266-b4a8-53af41586325-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500065 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-serving-cert\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc 
kubenswrapper[4694]: I0217 16:44:32.500079 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500094 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jxpg\" (UniqueName: \"kubernetes.io/projected/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-kube-api-access-4jxpg\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500108 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b62f849-06cf-430a-8b44-c0b8b5a652c6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500137 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500151 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghtf5\" (UniqueName: 
\"kubernetes.io/projected/eea272da-4da9-4f26-b66c-1aba9bbde6bc-kube-api-access-ghtf5\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500166 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500185 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500201 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ms9m\" (UniqueName: \"kubernetes.io/projected/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-kube-api-access-7ms9m\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500218 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eea272da-4da9-4f26-b66c-1aba9bbde6bc-serving-cert\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: 
\"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500234 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500251 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-proxy-tls\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500266 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-policies\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500281 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-service-ca\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500297 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2s2vh\" (UniqueName: \"kubernetes.io/projected/4387c481-04e8-4060-affe-f9b6fc0b1406-kube-api-access-2s2vh\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500311 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-image-import-ca\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500327 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cbe9f19-4c05-4266-b4a8-53af41586325-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500342 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-console-config\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500356 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: 
I0217 16:44:32.500372 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-config\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500387 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsqsc\" (UniqueName: \"kubernetes.io/projected/17f479b7-8c1f-4fd0-a106-c912c664579a-kube-api-access-qsqsc\") pod \"dns-operator-744455d44c-wq4dh\" (UID: \"17f479b7-8c1f-4fd0-a106-c912c664579a\") " pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500405 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-encryption-config\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500419 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500434 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-images\") pod 
\"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500449 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-client-ca\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500462 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-audit-dir\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500510 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgm7c\" (UniqueName: \"kubernetes.io/projected/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-kube-api-access-jgm7c\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500526 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500540 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr45b\" (UniqueName: \"kubernetes.io/projected/333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a-kube-api-access-fr45b\") pod \"cluster-samples-operator-665b6dd947-l4wps\" (UID: \"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500560 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-serving-cert\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500579 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4cp9\" (UniqueName: \"kubernetes.io/projected/d1c76767-8f16-4926-b632-8611bc27de87-kube-api-access-c4cp9\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500600 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500641 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/63670946-1f60-490f-b79b-d4bacbc46803-config\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500661 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-serving-cert\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500683 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-oauth-config\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500701 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b62f849-06cf-430a-8b44-c0b8b5a652c6-config\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500726 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7zp7\" (UniqueName: \"kubernetes.io/projected/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-kube-api-access-j7zp7\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 
crc kubenswrapper[4694]: I0217 16:44:32.500758 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500784 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500808 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-serving-cert\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500829 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1c76767-8f16-4926-b632-8611bc27de87-serving-cert\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500854 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-etcd-client\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500875 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500899 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl4lf\" (UniqueName: \"kubernetes.io/projected/d3ed4fee-4f78-4018-9a59-d8a98da1659f-kube-api-access-vl4lf\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500893 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500953 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.500914 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-config\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501511 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-audit\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501533 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-etcd-serving-ca\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501552 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-audit-policies\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501577 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-dir\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501595 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67ab93c3-9ab4-409c-b349-9032ff88e45b-serving-cert\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501653 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgwh\" (UniqueName: \"kubernetes.io/projected/cbd476af-578e-47ea-bfae-4b4d1303106c-kube-api-access-csgwh\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501694 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66f95bf-fb5b-4682-b8b1-910d78519ba4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501714 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hggrh\" (UniqueName: \"kubernetes.io/projected/6cbe9f19-4c05-4266-b4a8-53af41586325-kube-api-access-hggrh\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501746 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 
16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501772 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e66f95bf-fb5b-4682-b8b1-910d78519ba4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501793 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlndf\" (UniqueName: \"kubernetes.io/projected/f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2-kube-api-access-mlndf\") pod \"downloads-7954f5f757-nsrtk\" (UID: \"f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2\") " pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501813 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-serving-cert\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501836 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501855 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-oauth-serving-cert\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501873 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63670946-1f60-490f-b79b-d4bacbc46803-auth-proxy-config\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501892 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-etcd-client\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501914 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-config\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501938 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8rfg\" (UniqueName: \"kubernetes.io/projected/552639c4-d873-44a5-bbf1-0ada555d4d92-kube-api-access-n8rfg\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501962 
4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-config\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501976 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b62f849-06cf-430a-8b44-c0b8b5a652c6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.501996 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-kube-api-access-hc6s6\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502013 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc8x\" (UniqueName: \"kubernetes.io/projected/e67cd89f-5768-44e0-9d5c-76e27ab585b7-kube-api-access-6xc8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502026 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502040 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502057 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502053 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67cd89f-5768-44e0-9d5c-76e27ab585b7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502388 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/63670946-1f60-490f-b79b-d4bacbc46803-machine-approver-tls\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.502426 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-trusted-ca-bundle\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.503158 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.503657 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.506018 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.515151 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.517527 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.517989 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.518246 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.518550 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.521151 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.521232 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.523152 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.526938 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.527591 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.527820 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.528442 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.531267 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.531328 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8q2l"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.532015 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.532287 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.532714 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.532761 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.532713 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.533071 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kclz7"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.554307 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lcq72"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.555467 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.555564 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-b24qw"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.555773 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.556693 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-t7nr4"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.557083 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.557257 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.558607 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.558766 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.560254 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.561929 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.564182 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.566773 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.569509 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-85fqn"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.572665 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bjgql"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.572784 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.577844 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gn9k9"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.579364 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.582357 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nq4xh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.583133 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.585791 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.587658 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wq4dh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.588712 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.589354 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nsrtk"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.590490 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-96t75"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.590581 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.591670 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.592823 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 
16:44:32.594354 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-896vh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.597197 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.598348 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmljb"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.599736 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.600665 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.601201 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603077 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4aa91f8-086e-415b-aadc-da13d3d90ae9-service-ca-bundle\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603460 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfpk\" (UniqueName: \"kubernetes.io/projected/5ca423ac-de33-427a-b561-f04e6631b6d8-kube-api-access-wlfpk\") pod \"package-server-manager-789f6589d5-27mc4\" (UID: \"5ca423ac-de33-427a-b561-f04e6631b6d8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:32 crc 
kubenswrapper[4694]: I0217 16:44:32.603499 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl4lf\" (UniqueName: \"kubernetes.io/projected/d3ed4fee-4f78-4018-9a59-d8a98da1659f-kube-api-access-vl4lf\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603527 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-config\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603551 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmcmj\" (UniqueName: \"kubernetes.io/projected/6914ad57-b1bc-4449-abe9-02e7183d92a9-kube-api-access-kmcmj\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603587 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-audit\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603635 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67ab93c3-9ab4-409c-b349-9032ff88e45b-serving-cert\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603661 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-socket-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603692 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b6e9465f-5af8-48cb-b71b-3453e04acb1a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603723 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66f95bf-fb5b-4682-b8b1-910d78519ba4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603755 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e66f95bf-fb5b-4682-b8b1-910d78519ba4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603782 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mlndf\" (UniqueName: \"kubernetes.io/projected/f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2-kube-api-access-mlndf\") pod \"downloads-7954f5f757-nsrtk\" (UID: \"f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2\") " pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603806 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6914ad57-b1bc-4449-abe9-02e7183d92a9-signing-key\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603832 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603856 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-serving-cert\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603916 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6914ad57-b1bc-4449-abe9-02e7183d92a9-signing-cabundle\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 
16:44:32.603944 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-etcd-client\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603994 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpknn\" (UniqueName: \"kubernetes.io/projected/a5b422d1-061f-4d69-9ed9-247f4930fe99-kube-api-access-fpknn\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: \"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604023 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfx9n\" (UniqueName: \"kubernetes.io/projected/c4aa91f8-086e-415b-aadc-da13d3d90ae9-kube-api-access-lfx9n\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604045 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzpzz\" (UniqueName: \"kubernetes.io/projected/b6e9465f-5af8-48cb-b71b-3453e04acb1a-kube-api-access-gzpzz\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604069 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604093 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604115 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmh8\" (UniqueName: \"kubernetes.io/projected/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-kube-api-access-gbmh8\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604144 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-trusted-ca-bundle\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604168 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604191 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-config\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604216 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-encryption-config\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604239 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3ed4fee-4f78-4018-9a59-d8a98da1659f-audit-dir\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604268 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67cd89f-5768-44e0-9d5c-76e27ab585b7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604297 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e66f95bf-fb5b-4682-b8b1-910d78519ba4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604326 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-trusted-ca\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604354 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604379 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l4wps\" (UID: \"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604400 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cbe9f19-4c05-4266-b4a8-53af41586325-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604422 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-serving-cert\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604444 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604464 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6e9465f-5af8-48cb-b71b-3453e04acb1a-images\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604501 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghtf5\" (UniqueName: \"kubernetes.io/projected/eea272da-4da9-4f26-b66c-1aba9bbde6bc-kube-api-access-ghtf5\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604524 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604545 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbc2bd3c-fb82-4835-9103-f7bf30e51f17-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zjbh5\" (UID: \"cbc2bd3c-fb82-4835-9103-f7bf30e51f17\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604571 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-proxy-tls\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604589 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8nkm\" (UniqueName: \"kubernetes.io/projected/cbc2bd3c-fb82-4835-9103-f7bf30e51f17-kube-api-access-f8nkm\") pod \"control-plane-machine-set-operator-78cbb6b69f-zjbh5\" (UID: \"cbc2bd3c-fb82-4835-9103-f7bf30e51f17\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604631 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s2vh\" (UniqueName: \"kubernetes.io/projected/4387c481-04e8-4060-affe-f9b6fc0b1406-kube-api-access-2s2vh\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604652 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca423ac-de33-427a-b561-f04e6631b6d8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-27mc4\" (UID: \"5ca423ac-de33-427a-b561-f04e6631b6d8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604675 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-image-import-ca\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604694 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-console-config\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604714 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604733 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-encryption-config\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604754 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a5b422d1-061f-4d69-9ed9-247f4930fe99-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: \"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604776 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604795 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-images\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604812 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-audit-dir\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604832 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgm7c\" (UniqueName: \"kubernetes.io/projected/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-kube-api-access-jgm7c\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604853 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc27r\" (UniqueName: \"kubernetes.io/projected/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-kube-api-access-vc27r\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604877 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604898 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr45b\" (UniqueName: \"kubernetes.io/projected/333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a-kube-api-access-fr45b\") pod \"cluster-samples-operator-665b6dd947-l4wps\" (UID: \"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604921 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4cp9\" (UniqueName: \"kubernetes.io/projected/d1c76767-8f16-4926-b632-8611bc27de87-kube-api-access-c4cp9\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604945 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-csi-data-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604967 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2wb\" (UniqueName: \"kubernetes.io/projected/5b4e3579-1839-4de9-8e52-71ca7976b353-kube-api-access-4p2wb\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.604991 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63670946-1f60-490f-b79b-d4bacbc46803-config\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605020 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-oauth-config\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605043 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b62f849-06cf-430a-8b44-c0b8b5a652c6-config\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605068 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605088 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-serving-cert\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605130 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1c76767-8f16-4926-b632-8611bc27de87-serving-cert\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605155 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-etcd-client\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605178 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605200 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605224 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-etcd-serving-ca\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605249 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-audit-policies\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605282 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-dir\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605304 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgwh\" (UniqueName: \"kubernetes.io/projected/cbd476af-578e-47ea-bfae-4b4d1303106c-kube-api-access-csgwh\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605327 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rtn\" (UniqueName: \"kubernetes.io/projected/04ac4a19-2aa4-44da-ac5d-4df6622094b2-kube-api-access-22rtn\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605348 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b4e3579-1839-4de9-8e52-71ca7976b353-srv-cert\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605470 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-audit\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605928 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e66f95bf-fb5b-4682-b8b1-910d78519ba4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.605989 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-config\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.603336 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf"]
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.606760 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hggrh\" (UniqueName: \"kubernetes.io/projected/6cbe9f19-4c05-4266-b4a8-53af41586325-kube-api-access-hggrh\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.607431 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.607507 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3ed4fee-4f78-4018-9a59-d8a98da1659f-audit-dir\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.607516 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63670946-1f60-490f-b79b-d4bacbc46803-config\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.608088 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-config\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.609000 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.609179 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-trusted-ca-bundle\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.609217 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-trusted-ca\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.609374 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-dir\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.609500 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-console-config\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.610702 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-images\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.611670 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8q2l"]
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.612847 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-image-import-ca\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.613165 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615018 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615138 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-audit-dir\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615186 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b4e3579-1839-4de9-8e52-71ca7976b353-profile-collector-cert\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615235 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-oauth-serving-cert\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615299 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-etcd-serving-ca\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615748 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-serving-cert\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.615780 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616088 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-etcd-client\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616177 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8rfg\" (UniqueName: \"kubernetes.io/projected/552639c4-d873-44a5-bbf1-0ada555d4d92-kube-api-access-n8rfg\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616205 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63670946-1f60-490f-b79b-d4bacbc46803-auth-proxy-config\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616232 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-config\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616257 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-config\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616280 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b62f849-06cf-430a-8b44-c0b8b5a652c6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616304 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-kube-api-access-hc6s6\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616327 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04ac4a19-2aa4-44da-ac5d-4df6622094b2-secret-volume\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616424 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc8x\" (UniqueName: \"kubernetes.io/projected/e67cd89f-5768-44e0-9d5c-76e27ab585b7-kube-api-access-6xc8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616443 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04ac4a19-2aa4-44da-ac5d-4df6622094b2-config-volume\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616458 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-mountpoint-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616480 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616498 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67cd89f-5768-44e0-9d5c-76e27ab585b7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616515 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/63670946-1f60-490f-b79b-d4bacbc46803-machine-approver-tls\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616534 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbd476af-578e-47ea-bfae-4b4d1303106c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616552 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-config\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616567 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616583 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3ed4fee-4f78-4018-9a59-d8a98da1659f-node-pullsecrets\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616600 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/67ab93c3-9ab4-409c-b349-9032ff88e45b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616727 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-service-ca-bundle\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616746 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f479b7-8c1f-4fd0-a106-c912c664579a-metrics-tls\") pod \"dns-operator-744455d44c-wq4dh\" (UID: \"17f479b7-8c1f-4fd0-a106-c912c664579a\") " pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616765 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616780 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-plugins-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616802 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwck8\" (UniqueName: \"kubernetes.io/projected/67ab93c3-9ab4-409c-b349-9032ff88e45b-kube-api-access-gwck8\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616818 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8fw\" (UniqueName: \"kubernetes.io/projected/63670946-1f60-490f-b79b-d4bacbc46803-kube-api-access-zs8fw\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616837 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-client-ca\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616858 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cbe9f19-4c05-4266-b4a8-53af41586325-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616881 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616907 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jxpg\" (UniqueName: \"kubernetes.io/projected/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-kube-api-access-4jxpg\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616927 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-registration-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616948 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6e9465f-5af8-48cb-b71b-3453e04acb1a-proxy-tls\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616969 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b62f849-06cf-430a-8b44-c0b8b5a652c6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.616991 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617011 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c3096ffe-2960-4b33-9e8b-935b818b973c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617032 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c1264d-0ebb-41e1-aacd-b68532d19b93-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617054 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-default-certificate\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617075 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617094 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcsh2\" (UniqueName: \"kubernetes.io/projected/348e6db7-381a-4772-abbf-812a3b883c17-kube-api-access-bcsh2\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9"
Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617891 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fmljb\" (UID:
\"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617922 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ms9m\" (UniqueName: \"kubernetes.io/projected/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-kube-api-access-7ms9m\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617941 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eea272da-4da9-4f26-b66c-1aba9bbde6bc-serving-cert\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617964 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gckp\" (UniqueName: \"kubernetes.io/projected/c3096ffe-2960-4b33-9e8b-935b818b973c-kube-api-access-7gckp\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.617992 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-policies\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618015 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-service-ca\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618040 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c3096ffe-2960-4b33-9e8b-935b818b973c-srv-cert\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618064 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cbe9f19-4c05-4266-b4a8-53af41586325-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618098 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-config\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618116 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsqsc\" (UniqueName: \"kubernetes.io/projected/17f479b7-8c1f-4fd0-a106-c912c664579a-kube-api-access-qsqsc\") pod \"dns-operator-744455d44c-wq4dh\" (UID: \"17f479b7-8c1f-4fd0-a106-c912c664579a\") " pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618136 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-stats-auth\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618145 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-serving-cert\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618151 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-metrics-certs\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618209 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c1264d-0ebb-41e1-aacd-b68532d19b93-config\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618297 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-client-ca\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 
17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618321 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81c1264d-0ebb-41e1-aacd-b68532d19b93-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618371 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-serving-cert\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618398 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618421 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618479 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-serving-cert\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618507 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7zp7\" (UniqueName: \"kubernetes.io/projected/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-kube-api-access-j7zp7\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618575 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-audit-policies\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.618646 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gn9k9"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.619239 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/67ab93c3-9ab4-409c-b349-9032ff88e45b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.619841 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-oauth-serving-cert\") pod \"console-f9d7485db-896vh\" (UID: 
\"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.620078 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.621473 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63670946-1f60-490f-b79b-d4bacbc46803-auth-proxy-config\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.621890 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.622426 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-config\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.622447 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.623125 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ed4fee-4f78-4018-9a59-d8a98da1659f-config\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.624073 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.624440 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-serving-cert\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.624680 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-policies\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.625077 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eea272da-4da9-4f26-b66c-1aba9bbde6bc-serving-cert\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.625092 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.625190 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-client-ca\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.625705 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-service-ca\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.625759 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-config\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.625805 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.626071 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.626580 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.626655 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6d7zh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.627280 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-serving-cert\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.627350 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/d3ed4fee-4f78-4018-9a59-d8a98da1659f-node-pullsecrets\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.627676 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.628514 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-client-ca\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.628623 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-config\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.628809 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.629213 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fmljb\" 
(UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.629389 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-etcd-client\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.629823 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.629866 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.629964 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-c7bvc"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.630045 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.630722 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-proxy-tls\") pod 
\"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.630831 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-encryption-config\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631013 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1c76767-8f16-4926-b632-8611bc27de87-serving-cert\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631214 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-oauth-config\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631352 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631553 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-encryption-config\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631593 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l4wps\" (UID: \"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631665 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e66f95bf-fb5b-4682-b8b1-910d78519ba4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631697 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67ab93c3-9ab4-409c-b349-9032ff88e45b-serving-cert\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.631929 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cbe9f19-4c05-4266-b4a8-53af41586325-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc 
kubenswrapper[4694]: I0217 16:44:32.631930 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-service-ca-bundle\") pod \"authentication-operator-69f744f599-bjgql\" (UID: \"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.632394 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cbe9f19-4c05-4266-b4a8-53af41586325-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.632408 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.632680 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.634348 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f479b7-8c1f-4fd0-a106-c912c664579a-metrics-tls\") pod \"dns-operator-744455d44c-wq4dh\" (UID: \"17f479b7-8c1f-4fd0-a106-c912c664579a\") " pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.634563 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: 
\"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.634644 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/63670946-1f60-490f-b79b-d4bacbc46803-machine-approver-tls\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.634960 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ed4fee-4f78-4018-9a59-d8a98da1659f-serving-cert\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.635003 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.635057 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbd476af-578e-47ea-bfae-4b4d1303106c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.637024 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.637759 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp"] Feb 17 16:44:32 crc kubenswrapper[4694]: 
I0217 16:44:32.639573 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.640651 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lcq72"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.641816 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.642050 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.644027 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-b24qw"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.646144 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.647986 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.649908 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kclz7"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.651384 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nq4xh"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.652785 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-85fqn"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.656201 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.660478 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r8g5f"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.661006 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.661734 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r8g5f"] Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.661836 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.669313 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b62f849-06cf-430a-8b44-c0b8b5a652c6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.680399 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.688640 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b62f849-06cf-430a-8b44-c0b8b5a652c6-config\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.701424 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720213 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720256 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rtn\" (UniqueName: \"kubernetes.io/projected/04ac4a19-2aa4-44da-ac5d-4df6622094b2-kube-api-access-22rtn\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720278 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b4e3579-1839-4de9-8e52-71ca7976b353-srv-cert\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720323 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b4e3579-1839-4de9-8e52-71ca7976b353-profile-collector-cert\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720363 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/04ac4a19-2aa4-44da-ac5d-4df6622094b2-secret-volume\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720384 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04ac4a19-2aa4-44da-ac5d-4df6622094b2-config-volume\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720405 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-mountpoint-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720439 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720472 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-plugins-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720507 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-registration-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720524 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6e9465f-5af8-48cb-b71b-3453e04acb1a-proxy-tls\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720540 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720563 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c3096ffe-2960-4b33-9e8b-935b818b973c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720586 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c1264d-0ebb-41e1-aacd-b68532d19b93-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720613 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-default-certificate\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720662 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcsh2\" (UniqueName: \"kubernetes.io/projected/348e6db7-381a-4772-abbf-812a3b883c17-kube-api-access-bcsh2\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720687 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c3096ffe-2960-4b33-9e8b-935b818b973c-srv-cert\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720709 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gckp\" (UniqueName: \"kubernetes.io/projected/c3096ffe-2960-4b33-9e8b-935b818b973c-kube-api-access-7gckp\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720741 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-stats-auth\") pod 
\"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720764 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-metrics-certs\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720785 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c1264d-0ebb-41e1-aacd-b68532d19b93-config\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720808 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81c1264d-0ebb-41e1-aacd-b68532d19b93-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720816 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-mountpoint-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720839 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c4aa91f8-086e-415b-aadc-da13d3d90ae9-service-ca-bundle\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720863 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfpk\" (UniqueName: \"kubernetes.io/projected/5ca423ac-de33-427a-b561-f04e6631b6d8-kube-api-access-wlfpk\") pod \"package-server-manager-789f6589d5-27mc4\" (UID: \"5ca423ac-de33-427a-b561-f04e6631b6d8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720886 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmcmj\" (UniqueName: \"kubernetes.io/projected/6914ad57-b1bc-4449-abe9-02e7183d92a9-kube-api-access-kmcmj\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720929 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-socket-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720950 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b6e9465f-5af8-48cb-b71b-3453e04acb1a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.720781 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-plugins-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721020 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6914ad57-b1bc-4449-abe9-02e7183d92a9-signing-key\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721046 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6914ad57-b1bc-4449-abe9-02e7183d92a9-signing-cabundle\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721070 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpknn\" (UniqueName: \"kubernetes.io/projected/a5b422d1-061f-4d69-9ed9-247f4930fe99-kube-api-access-fpknn\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: \"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721094 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzpzz\" (UniqueName: \"kubernetes.io/projected/b6e9465f-5af8-48cb-b71b-3453e04acb1a-kube-api-access-gzpzz\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 
16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721119 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfx9n\" (UniqueName: \"kubernetes.io/projected/c4aa91f8-086e-415b-aadc-da13d3d90ae9-kube-api-access-lfx9n\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721140 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721193 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbmh8\" (UniqueName: \"kubernetes.io/projected/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-kube-api-access-gbmh8\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721241 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6e9465f-5af8-48cb-b71b-3453e04acb1a-images\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721263 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-registration-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721274 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbc2bd3c-fb82-4835-9103-f7bf30e51f17-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zjbh5\" (UID: \"cbc2bd3c-fb82-4835-9103-f7bf30e51f17\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721333 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8nkm\" (UniqueName: \"kubernetes.io/projected/cbc2bd3c-fb82-4835-9103-f7bf30e51f17-kube-api-access-f8nkm\") pod \"control-plane-machine-set-operator-78cbb6b69f-zjbh5\" (UID: \"cbc2bd3c-fb82-4835-9103-f7bf30e51f17\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721356 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca423ac-de33-427a-b561-f04e6631b6d8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-27mc4\" (UID: \"5ca423ac-de33-427a-b561-f04e6631b6d8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721390 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a5b422d1-061f-4d69-9ed9-247f4930fe99-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: 
\"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721420 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc27r\" (UniqueName: \"kubernetes.io/projected/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-kube-api-access-vc27r\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721460 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-csi-data-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721483 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p2wb\" (UniqueName: \"kubernetes.io/projected/5b4e3579-1839-4de9-8e52-71ca7976b353-kube-api-access-4p2wb\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721527 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-socket-dir\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721655 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/348e6db7-381a-4772-abbf-812a3b883c17-csi-data-dir\") pod 
\"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.721754 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b6e9465f-5af8-48cb-b71b-3453e04acb1a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.722042 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.740402 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.751236 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67cd89f-5768-44e0-9d5c-76e27ab585b7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.760163 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.769142 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67cd89f-5768-44e0-9d5c-76e27ab585b7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: 
\"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.780700 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.820880 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.841158 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.861085 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.886318 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.900206 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.921211 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.944085 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.961347 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.981122 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 16:44:32 crc kubenswrapper[4694]: I0217 16:44:32.983078 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b6e9465f-5af8-48cb-b71b-3453e04acb1a-images\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.001445 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.020584 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.044515 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.057209 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6e9465f-5af8-48cb-b71b-3453e04acb1a-proxy-tls\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.060275 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.066021 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.080875 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.082077 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.100815 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.120922 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.141090 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.160759 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.180565 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.201528 4694 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.222023 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.235960 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c1264d-0ebb-41e1-aacd-b68532d19b93-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.240640 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.242206 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c1264d-0ebb-41e1-aacd-b68532d19b93-config\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.260739 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.264969 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c3096ffe-2960-4b33-9e8b-935b818b973c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.273169 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04ac4a19-2aa4-44da-ac5d-4df6622094b2-secret-volume\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.274254 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b4e3579-1839-4de9-8e52-71ca7976b353-profile-collector-cert\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.280723 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.300247 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.306410 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c3096ffe-2960-4b33-9e8b-935b818b973c-srv-cert\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.321628 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.343008 4694 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.356963 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbc2bd3c-fb82-4835-9103-f7bf30e51f17-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zjbh5\" (UID: \"cbc2bd3c-fb82-4835-9103-f7bf30e51f17\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.360539 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.364143 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b4e3579-1839-4de9-8e52-71ca7976b353-srv-cert\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.382215 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.397227 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ca423ac-de33-427a-b561-f04e6631b6d8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-27mc4\" (UID: \"5ca423ac-de33-427a-b561-f04e6631b6d8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.402304 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.421283 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.432759 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04ac4a19-2aa4-44da-ac5d-4df6622094b2-config-volume\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.441081 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.471946 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.481412 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.482256 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.501832 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.514445 4694 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.521785 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.538717 4694 request.go:700] Waited for 1.00563259s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.541671 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.560924 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.581058 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.600113 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.621376 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.627909 4694 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out 
waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.628051 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config podName:cbd476af-578e-47ea-bfae-4b4d1303106c nodeName:}" failed. No retries permitted until 2026-02-17 16:44:34.12801939 +0000 UTC m=+141.885094754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config") pod "openshift-apiserver-operator-796bbdcf4f-9fhv7" (UID: "cbd476af-578e-47ea-bfae-4b4d1303106c") : failed to sync configmap cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.641097 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.660580 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.665923 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6914ad57-b1bc-4449-abe9-02e7183d92a9-signing-key\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.680219 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.700747 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.702714 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/6914ad57-b1bc-4449-abe9-02e7183d92a9-signing-cabundle\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.720335 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721123 4694 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721388 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-metrics-certs podName:c4aa91f8-086e-415b-aadc-da13d3d90ae9 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:34.221360341 +0000 UTC m=+141.978435705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-metrics-certs") pod "router-default-5444994796-t7nr4" (UID: "c4aa91f8-086e-415b-aadc-da13d3d90ae9") : failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721178 4694 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721678 4694 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721472 4694 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721740 4694 
configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.721947 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-stats-auth podName:c4aa91f8-086e-415b-aadc-da13d3d90ae9 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:34.221681609 +0000 UTC m=+141.978756963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-stats-auth") pod "router-default-5444994796-t7nr4" (UID: "c4aa91f8-086e-415b-aadc-da13d3d90ae9") : failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.722098 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b422d1-061f-4d69-9ed9-247f4930fe99-webhook-certs podName:a5b422d1-061f-4d69-9ed9-247f4930fe99 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:34.222082249 +0000 UTC m=+141.979157613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a5b422d1-061f-4d69-9ed9-247f4930fe99-webhook-certs") pod "multus-admission-controller-857f4d67dd-85fqn" (UID: "a5b422d1-061f-4d69-9ed9-247f4930fe99") : failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.722239 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-default-certificate podName:c4aa91f8-086e-415b-aadc-da13d3d90ae9 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:34.222225463 +0000 UTC m=+141.979300827 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-default-certificate") pod "router-default-5444994796-t7nr4" (UID: "c4aa91f8-086e-415b-aadc-da13d3d90ae9") : failed to sync secret cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: E0217 16:44:33.722397 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c4aa91f8-086e-415b-aadc-da13d3d90ae9-service-ca-bundle podName:c4aa91f8-086e-415b-aadc-da13d3d90ae9 nodeName:}" failed. No retries permitted until 2026-02-17 16:44:34.222383097 +0000 UTC m=+141.979458451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c4aa91f8-086e-415b-aadc-da13d3d90ae9-service-ca-bundle") pod "router-default-5444994796-t7nr4" (UID: "c4aa91f8-086e-415b-aadc-da13d3d90ae9") : failed to sync configmap cache: timed out waiting for the condition Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.740362 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.760536 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.780516 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.801346 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.821629 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.841229 4694 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.861157 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.880321 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.900705 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.921138 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.941152 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.960630 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 16:44:33 crc kubenswrapper[4694]: I0217 16:44:33.981839 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.001201 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.020674 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.040688 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.061100 4694 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.081785 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.101015 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.120585 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.140572 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.141133 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.160524 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.179866 4694 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.200083 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.220232 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.239734 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.242716 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a5b422d1-061f-4d69-9ed9-247f4930fe99-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: \"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.242915 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-default-certificate\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.242980 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-stats-auth\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.243005 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-metrics-certs\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.243048 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c4aa91f8-086e-415b-aadc-da13d3d90ae9-service-ca-bundle\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.244198 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4aa91f8-086e-415b-aadc-da13d3d90ae9-service-ca-bundle\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.247865 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-metrics-certs\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.249135 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a5b422d1-061f-4d69-9ed9-247f4930fe99-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: \"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.249171 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-stats-auth\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.249989 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/c4aa91f8-086e-415b-aadc-da13d3d90ae9-default-certificate\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.260131 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.315954 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl4lf\" (UniqueName: \"kubernetes.io/projected/d3ed4fee-4f78-4018-9a59-d8a98da1659f-kube-api-access-vl4lf\") pod \"apiserver-76f77b778f-wqqd4\" (UID: \"d3ed4fee-4f78-4018-9a59-d8a98da1659f\") " pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.334541 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66f95bf-fb5b-4682-b8b1-910d78519ba4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-whdfp\" (UID: \"e66f95bf-fb5b-4682-b8b1-910d78519ba4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.358714 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlndf\" (UniqueName: \"kubernetes.io/projected/f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2-kube-api-access-mlndf\") pod \"downloads-7954f5f757-nsrtk\" (UID: \"f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2\") " pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.378024 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgm7c\" (UniqueName: \"kubernetes.io/projected/b2d6ed98-7c31-42e0-8e85-c6cb28da320e-kube-api-access-jgm7c\") pod \"authentication-operator-69f744f599-bjgql\" (UID: 
\"b2d6ed98-7c31-42e0-8e85-c6cb28da320e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.394600 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgwh\" (UniqueName: \"kubernetes.io/projected/cbd476af-578e-47ea-bfae-4b4d1303106c-kube-api-access-csgwh\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.422876 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghtf5\" (UniqueName: \"kubernetes.io/projected/eea272da-4da9-4f26-b66c-1aba9bbde6bc-kube-api-access-ghtf5\") pod \"controller-manager-879f6c89f-vgfk2\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.439732 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hggrh\" (UniqueName: \"kubernetes.io/projected/6cbe9f19-4c05-4266-b4a8-53af41586325-kube-api-access-hggrh\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.468749 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s2vh\" (UniqueName: \"kubernetes.io/projected/4387c481-04e8-4060-affe-f9b6fc0b1406-kube-api-access-2s2vh\") pod \"console-f9d7485db-896vh\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.477023 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c4cp9\" (UniqueName: \"kubernetes.io/projected/d1c76767-8f16-4926-b632-8611bc27de87-kube-api-access-c4cp9\") pod \"route-controller-manager-6576b87f9c-7vtpj\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.479942 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.489164 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.497700 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7zp7\" (UniqueName: \"kubernetes.io/projected/1e0a66e8-45d1-43ff-8a2b-c3614b8ac955-kube-api-access-j7zp7\") pod \"machine-config-controller-84d6567774-ljj7b\" (UID: \"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.504719 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.519553 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr45b\" (UniqueName: \"kubernetes.io/projected/333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a-kube-api-access-fr45b\") pod \"cluster-samples-operator-665b6dd947-l4wps\" (UID: \"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.535424 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.537004 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cbe9f19-4c05-4266-b4a8-53af41586325-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cx4lp\" (UID: \"6cbe9f19-4c05-4266-b4a8-53af41586325\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.539295 4694 request.go:700] Waited for 1.918785103s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.561264 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8rfg\" (UniqueName: \"kubernetes.io/projected/552639c4-d873-44a5-bbf1-0ada555d4d92-kube-api-access-n8rfg\") pod \"oauth-openshift-558db77b4-fmljb\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.579694 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ms9m\" (UniqueName: \"kubernetes.io/projected/5733b257-6fe2-4df1-aa83-4eaf3a84fdcc-kube-api-access-7ms9m\") pod \"machine-api-operator-5694c8668f-t44rx\" (UID: \"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.595179 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b62f849-06cf-430a-8b44-c0b8b5a652c6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vjhtp\" (UID: \"5b62f849-06cf-430a-8b44-c0b8b5a652c6\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.615683 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc6s6\" (UniqueName: \"kubernetes.io/projected/d6283fbb-fd41-4d60-953a-26ee8e1c08e0-kube-api-access-hc6s6\") pod \"console-operator-58897d9998-96t75\" (UID: \"d6283fbb-fd41-4d60-953a-26ee8e1c08e0\") " pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.633204 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.639649 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc8x\" (UniqueName: \"kubernetes.io/projected/e67cd89f-5768-44e0-9d5c-76e27ab585b7-kube-api-access-6xc8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-c9hn8\" (UID: \"e67cd89f-5768-44e0-9d5c-76e27ab585b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.655896 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsqsc\" (UniqueName: \"kubernetes.io/projected/17f479b7-8c1f-4fd0-a106-c912c664579a-kube-api-access-qsqsc\") pod \"dns-operator-744455d44c-wq4dh\" (UID: \"17f479b7-8c1f-4fd0-a106-c912c664579a\") " pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.658407 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.667849 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.675401 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jxpg\" (UniqueName: \"kubernetes.io/projected/a8ed54f8-a641-4418-bf3f-59d8cb2ab7da-kube-api-access-4jxpg\") pod \"apiserver-7bbb656c7d-99c97\" (UID: \"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.680676 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.689724 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.705894 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nsrtk"] Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.712205 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.717832 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwck8\" (UniqueName: \"kubernetes.io/projected/67ab93c3-9ab4-409c-b349-9032ff88e45b-kube-api-access-gwck8\") pod \"openshift-config-operator-7777fb866f-mtmz6\" (UID: \"67ab93c3-9ab4-409c-b349-9032ff88e45b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.723651 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.732836 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-896vh"] Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.743084 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 16:44:34 crc kubenswrapper[4694]: W0217 16:44:34.756798 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4387c481_04e8_4060_affe_f9b6fc0b1406.slice/crio-ae9a2ef84a342ccdec952795ad9e6595dcb9a9b950cad3ef3dc9dd6410ab3f9f WatchSource:0}: Error finding container ae9a2ef84a342ccdec952795ad9e6595dcb9a9b950cad3ef3dc9dd6410ab3f9f: Status 404 returned error can't find the container with id ae9a2ef84a342ccdec952795ad9e6595dcb9a9b950cad3ef3dc9dd6410ab3f9f Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.761182 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.766018 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.774780 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp"] Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.781645 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.796412 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.800195 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.811868 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wqqd4"] Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.812400 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.819139 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.820902 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.824701 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.832767 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.841903 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 16:44:34 crc kubenswrapper[4694]: W0217 16:44:34.860527 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66f95bf_fb5b_4682_b8b1_910d78519ba4.slice/crio-145b4a3a8ea060a1756b74ab87b9293fbaf4a246a1ae522f64a090c99c05af44 WatchSource:0}: Error finding container 145b4a3a8ea060a1756b74ab87b9293fbaf4a246a1ae522f64a090c99c05af44: Status 404 returned error can't find the container with id 145b4a3a8ea060a1756b74ab87b9293fbaf4a246a1ae522f64a090c99c05af44 Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.873891 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.881922 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rtn\" (UniqueName: \"kubernetes.io/projected/04ac4a19-2aa4-44da-ac5d-4df6622094b2-kube-api-access-22rtn\") pod \"collect-profiles-29522430-mx6nw\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.905141 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gckp\" (UniqueName: \"kubernetes.io/projected/c3096ffe-2960-4b33-9e8b-935b818b973c-kube-api-access-7gckp\") pod \"olm-operator-6b444d44fb-26p7p\" (UID: \"c3096ffe-2960-4b33-9e8b-935b818b973c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.921134 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfpk\" (UniqueName: \"kubernetes.io/projected/5ca423ac-de33-427a-b561-f04e6631b6d8-kube-api-access-wlfpk\") pod \"package-server-manager-789f6589d5-27mc4\" (UID: \"5ca423ac-de33-427a-b561-f04e6631b6d8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.925213 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.941379 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81c1264d-0ebb-41e1-aacd-b68532d19b93-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jltpb\" (UID: \"81c1264d-0ebb-41e1-aacd-b68532d19b93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.954105 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vgfk2"] Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.964312 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t44rx"] Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.978590 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfx9n\" (UniqueName: \"kubernetes.io/projected/c4aa91f8-086e-415b-aadc-da13d3d90ae9-kube-api-access-lfx9n\") pod \"router-default-5444994796-t7nr4\" (UID: \"c4aa91f8-086e-415b-aadc-da13d3d90ae9\") " pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.984454 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:34 crc kubenswrapper[4694]: I0217 16:44:34.990121 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmcmj\" (UniqueName: \"kubernetes.io/projected/6914ad57-b1bc-4449-abe9-02e7183d92a9-kube-api-access-kmcmj\") pod \"service-ca-9c57cc56f-lcq72\" (UID: \"6914ad57-b1bc-4449-abe9-02e7183d92a9\") " pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:34.997252 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcsh2\" (UniqueName: \"kubernetes.io/projected/348e6db7-381a-4772-abbf-812a3b883c17-kube-api-access-bcsh2\") pod \"csi-hostpathplugin-gn9k9\" (UID: \"348e6db7-381a-4772-abbf-812a3b883c17\") " pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.000216 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:35 crc kubenswrapper[4694]: W0217 16:44:35.004518 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea272da_4da9_4f26_b66c_1aba9bbde6bc.slice/crio-8fa6649b2529f89760c1210aae6991ed89ebc761d6a4f8f2c732056c9e56df54 WatchSource:0}: Error finding container 8fa6649b2529f89760c1210aae6991ed89ebc761d6a4f8f2c732056c9e56df54: Status 404 returned error can't find the container with id 8fa6649b2529f89760c1210aae6991ed89ebc761d6a4f8f2c732056c9e56df54 Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.021850 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bjgql"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.028860 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.035057 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpknn\" (UniqueName: \"kubernetes.io/projected/a5b422d1-061f-4d69-9ed9-247f4930fe99-kube-api-access-fpknn\") pod \"multus-admission-controller-857f4d67dd-85fqn\" (UID: \"a5b422d1-061f-4d69-9ed9-247f4930fe99\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.043539 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmh8\" (UniqueName: \"kubernetes.io/projected/dcfad47b-0808-41a1-aa1f-f23b6eb262bd-kube-api-access-gbmh8\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqffd\" (UID: \"dcfad47b-0808-41a1-aa1f-f23b6eb262bd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.069457 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzpzz\" (UniqueName: \"kubernetes.io/projected/b6e9465f-5af8-48cb-b71b-3453e04acb1a-kube-api-access-gzpzz\") pod \"machine-config-operator-74547568cd-vxddf\" (UID: \"b6e9465f-5af8-48cb-b71b-3453e04acb1a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.088417 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.102803 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc27r\" (UniqueName: \"kubernetes.io/projected/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-kube-api-access-vc27r\") pod \"marketplace-operator-79b997595-t8q2l\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.112918 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8nkm\" (UniqueName: \"kubernetes.io/projected/cbc2bd3c-fb82-4835-9103-f7bf30e51f17-kube-api-access-f8nkm\") pod \"control-plane-machine-set-operator-78cbb6b69f-zjbh5\" (UID: \"cbc2bd3c-fb82-4835-9103-f7bf30e51f17\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.125048 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p2wb\" (UniqueName: \"kubernetes.io/projected/5b4e3579-1839-4de9-8e52-71ca7976b353-kube-api-access-4p2wb\") pod \"catalog-operator-68c6474976-pj4d2\" (UID: \"5b4e3579-1839-4de9-8e52-71ca7976b353\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.141534 4694 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.141603 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config podName:cbd476af-578e-47ea-bfae-4b4d1303106c nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.141586851 +0000 UTC m=+143.898662165 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config") pod "openshift-apiserver-operator-796bbdcf4f-9fhv7" (UID: "cbd476af-578e-47ea-bfae-4b4d1303106c") : failed to sync configmap cache: timed out waiting for the condition Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.142396 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.161877 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.163995 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.171457 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.178287 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.181467 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.187087 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.192327 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.197510 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs8fw\" (UniqueName: \"kubernetes.io/projected/63670946-1f60-490f-b79b-d4bacbc46803-kube-api-access-zs8fw\") pod \"machine-approver-56656f9798-ttftl\" (UID: \"63670946-1f60-490f-b79b-d4bacbc46803\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.199825 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.202901 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.207782 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.214633 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.237681 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267116 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/321eb694-d16e-4909-a6ee-7d36a8be937c-serving-cert\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267214 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-trusted-ca\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267254 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtlf\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-kube-api-access-lhtlf\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267383 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8p2t\" (UniqueName: \"kubernetes.io/projected/66b75582-c991-495b-aa57-74aaac8319e2-kube-api-access-g8p2t\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267429 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d017650-73fd-4db1-958d-7bca865a125b-apiservice-cert\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267548 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267704 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66b75582-c991-495b-aa57-74aaac8319e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267774 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3d017650-73fd-4db1-958d-7bca865a125b-tmpfs\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267837 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-registry-certificates\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.267889 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:35.767869646 +0000 UTC m=+143.524944970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267915 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619cd21e-c5e1-4c89-9907-242d8e394477-serving-cert\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.267941 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321eb694-d16e-4909-a6ee-7d36a8be937c-config\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268105 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66b75582-c991-495b-aa57-74aaac8319e2-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268146 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-registry-tls\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268166 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-ca\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268208 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-config\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268230 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-bound-sa-token\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268267 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-service-ca\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268294 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzjpl\" (UniqueName: \"kubernetes.io/projected/321eb694-d16e-4909-a6ee-7d36a8be937c-kube-api-access-rzjpl\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268353 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3388f49-0f84-40f2-8030-a6f508979e71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268379 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d017650-73fd-4db1-958d-7bca865a125b-webhook-cert\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268400 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66b75582-c991-495b-aa57-74aaac8319e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268475 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvsbg\" (UniqueName: \"kubernetes.io/projected/3d017650-73fd-4db1-958d-7bca865a125b-kube-api-access-pvsbg\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268520 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrbm\" (UniqueName: \"kubernetes.io/projected/619cd21e-c5e1-4c89-9907-242d8e394477-kube-api-access-zkrbm\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268544 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncckj\" (UniqueName: \"kubernetes.io/projected/9457bb31-c044-484a-bc14-44e93af90889-kube-api-access-ncckj\") pod \"migrator-59844c95c7-6swq2\" (UID: \"9457bb31-c044-484a-bc14-44e93af90889\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268570 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-client\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.268688 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3388f49-0f84-40f2-8030-a6f508979e71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.269972 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.270824 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.305570 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.345144 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370243 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.370403 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:35.8703816 +0000 UTC m=+143.627456924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370623 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8p2t\" (UniqueName: \"kubernetes.io/projected/66b75582-c991-495b-aa57-74aaac8319e2-kube-api-access-g8p2t\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370659 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-certs\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370720 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d017650-73fd-4db1-958d-7bca865a125b-apiservice-cert\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370744 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370772 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkb2l\" (UniqueName: \"kubernetes.io/projected/cda0a671-138c-4f21-a089-d5a2ef8f0712-kube-api-access-wkb2l\") pod \"ingress-canary-r8g5f\" (UID: \"cda0a671-138c-4f21-a089-d5a2ef8f0712\") " pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370798 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66b75582-c991-495b-aa57-74aaac8319e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370822 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3d017650-73fd-4db1-958d-7bca865a125b-tmpfs\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370865 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda0a671-138c-4f21-a089-d5a2ef8f0712-cert\") pod \"ingress-canary-r8g5f\" (UID: \"cda0a671-138c-4f21-a089-d5a2ef8f0712\") " pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370940 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-registry-certificates\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370973 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619cd21e-c5e1-4c89-9907-242d8e394477-serving-cert\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.370996 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321eb694-d16e-4909-a6ee-7d36a8be937c-config\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371027 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70100d5b-32f2-496c-af2f-f08ce53c9148-metrics-tls\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371134 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66b75582-c991-495b-aa57-74aaac8319e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371175 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-registry-tls\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371196 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-ca\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371239 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-config\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371289 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldwv\" (UniqueName: \"kubernetes.io/projected/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-kube-api-access-zldwv\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371336 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70100d5b-32f2-496c-af2f-f08ce53c9148-config-volume\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371401 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-bound-sa-token\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371437 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-service-ca\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371459 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzjpl\" (UniqueName: \"kubernetes.io/projected/321eb694-d16e-4909-a6ee-7d36a8be937c-kube-api-access-rzjpl\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371558 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3388f49-0f84-40f2-8030-a6f508979e71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371651 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d017650-73fd-4db1-958d-7bca865a125b-webhook-cert\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371674 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66b75582-c991-495b-aa57-74aaac8319e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371737 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6nww\" (UniqueName: \"kubernetes.io/projected/70100d5b-32f2-496c-af2f-f08ce53c9148-kube-api-access-r6nww\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371774 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncckj\" (UniqueName: \"kubernetes.io/projected/9457bb31-c044-484a-bc14-44e93af90889-kube-api-access-ncckj\") pod \"migrator-59844c95c7-6swq2\" (UID: \"9457bb31-c044-484a-bc14-44e93af90889\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371806 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvsbg\" (UniqueName: \"kubernetes.io/projected/3d017650-73fd-4db1-958d-7bca865a125b-kube-api-access-pvsbg\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371827 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrbm\" (UniqueName: 
\"kubernetes.io/projected/619cd21e-c5e1-4c89-9907-242d8e394477-kube-api-access-zkrbm\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.371850 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-client\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.372032 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3388f49-0f84-40f2-8030-a6f508979e71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.372116 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-node-bootstrap-token\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.372142 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/321eb694-d16e-4909-a6ee-7d36a8be937c-serving-cert\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.372177 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtlf\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-kube-api-access-lhtlf\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.372215 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-trusted-ca\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.374732 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/321eb694-d16e-4909-a6ee-7d36a8be937c-config\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.375698 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-config\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.376378 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-service-ca\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 
16:44:35.376909 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3388f49-0f84-40f2-8030-a6f508979e71-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.378756 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-trusted-ca\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.381780 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d017650-73fd-4db1-958d-7bca865a125b-apiservice-cert\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.382502 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619cd21e-c5e1-4c89-9907-242d8e394477-serving-cert\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.384382 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-registry-certificates\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc 
kubenswrapper[4694]: I0217 16:44:35.385264 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-ca\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.385512 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/619cd21e-c5e1-4c89-9907-242d8e394477-etcd-client\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.385865 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3d017650-73fd-4db1-958d-7bca865a125b-tmpfs\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.388627 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:35.88858862 +0000 UTC m=+143.645663944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.392631 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66b75582-c991-495b-aa57-74aaac8319e2-trusted-ca\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.393704 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-registry-tls\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.408302 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3388f49-0f84-40f2-8030-a6f508979e71-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.414386 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/321eb694-d16e-4909-a6ee-7d36a8be937c-serving-cert\") pod \"service-ca-operator-777779d784-kclz7\" (UID: 
\"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.423481 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d017650-73fd-4db1-958d-7bca865a125b-webhook-cert\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.432589 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/66b75582-c991-495b-aa57-74aaac8319e2-metrics-tls\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.437321 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvsbg\" (UniqueName: \"kubernetes.io/projected/3d017650-73fd-4db1-958d-7bca865a125b-kube-api-access-pvsbg\") pod \"packageserver-d55dfcdfc-rzdk7\" (UID: \"3d017650-73fd-4db1-958d-7bca865a125b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.450531 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-bound-sa-token\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.474278 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.474415 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-node-bootstrap-token\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.474469 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-certs\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.474873 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:35.974851713 +0000 UTC m=+143.731927037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.474922 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.474952 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkb2l\" (UniqueName: \"kubernetes.io/projected/cda0a671-138c-4f21-a089-d5a2ef8f0712-kube-api-access-wkb2l\") pod \"ingress-canary-r8g5f\" (UID: \"cda0a671-138c-4f21-a089-d5a2ef8f0712\") " pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.474972 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda0a671-138c-4f21-a089-d5a2ef8f0712-cert\") pod \"ingress-canary-r8g5f\" (UID: \"cda0a671-138c-4f21-a089-d5a2ef8f0712\") " pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.475001 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70100d5b-32f2-496c-af2f-f08ce53c9148-metrics-tls\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " 
pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.475037 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldwv\" (UniqueName: \"kubernetes.io/projected/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-kube-api-access-zldwv\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.475054 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70100d5b-32f2-496c-af2f-f08ce53c9148-config-volume\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.475091 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6nww\" (UniqueName: \"kubernetes.io/projected/70100d5b-32f2-496c-af2f-f08ce53c9148-kube-api-access-r6nww\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.478946 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70100d5b-32f2-496c-af2f-f08ce53c9148-config-volume\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.481399 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b"] Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.481770 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:35.981753377 +0000 UTC m=+143.738828751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.497002 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrbm\" (UniqueName: \"kubernetes.io/projected/619cd21e-c5e1-4c89-9907-242d8e394477-kube-api-access-zkrbm\") pod \"etcd-operator-b45778765-b24qw\" (UID: \"619cd21e-c5e1-4c89-9907-242d8e394477\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.500694 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmljb"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.507178 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.507243 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-wq4dh"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.515785 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.523510 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-96t75"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.527942 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cda0a671-138c-4f21-a089-d5a2ef8f0712-cert\") pod \"ingress-canary-r8g5f\" (UID: \"cda0a671-138c-4f21-a089-d5a2ef8f0712\") " pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.528129 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-certs\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.528638 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-node-bootstrap-token\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.533690 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzjpl\" (UniqueName: \"kubernetes.io/projected/321eb694-d16e-4909-a6ee-7d36a8be937c-kube-api-access-rzjpl\") pod \"service-ca-operator-777779d784-kclz7\" (UID: \"321eb694-d16e-4909-a6ee-7d36a8be937c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.535444 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-g8p2t\" (UniqueName: \"kubernetes.io/projected/66b75582-c991-495b-aa57-74aaac8319e2-kube-api-access-g8p2t\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.536642 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtlf\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-kube-api-access-lhtlf\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.542434 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70100d5b-32f2-496c-af2f-f08ce53c9148-metrics-tls\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.552662 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.555545 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncckj\" (UniqueName: \"kubernetes.io/projected/9457bb31-c044-484a-bc14-44e93af90889-kube-api-access-ncckj\") pod \"migrator-59844c95c7-6swq2\" (UID: \"9457bb31-c044-484a-bc14-44e93af90889\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.566271 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66b75582-c991-495b-aa57-74aaac8319e2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qn6jr\" (UID: \"66b75582-c991-495b-aa57-74aaac8319e2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.570638 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.575680 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.576049 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.576163 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.076143785 +0000 UTC m=+143.833219109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.576363 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.576722 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.076710239 +0000 UTC m=+143.833785583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.594088 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.595863 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6nww\" (UniqueName: \"kubernetes.io/projected/70100d5b-32f2-496c-af2f-f08ce53c9148-kube-api-access-r6nww\") pod \"dns-default-nq4xh\" (UID: \"70100d5b-32f2-496c-af2f-f08ce53c9148\") " pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.626857 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldwv\" (UniqueName: \"kubernetes.io/projected/f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee-kube-api-access-zldwv\") pod \"machine-config-server-c7bvc\" (UID: \"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee\") " pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.635213 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.641340 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" event={"ID":"63670946-1f60-490f-b79b-d4bacbc46803","Type":"ContainerStarted","Data":"34b515cf77d420e2253ce7d9b125a327d2d55944f8c581c2a1ab3ff309f6150f"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.643147 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkb2l\" (UniqueName: \"kubernetes.io/projected/cda0a671-138c-4f21-a089-d5a2ef8f0712-kube-api-access-wkb2l\") pod \"ingress-canary-r8g5f\" (UID: \"cda0a671-138c-4f21-a089-d5a2ef8f0712\") " pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.648118 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-c7bvc" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.657435 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8g5f" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.658862 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nsrtk" event={"ID":"f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2","Type":"ContainerStarted","Data":"9656a247baf35dca561cbbbabbc8d1d3b195ff5352b974395821adb5365d6205"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.658905 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nsrtk" event={"ID":"f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2","Type":"ContainerStarted","Data":"0929af382dd809ef60eedfb05e6a3c711c5f206738fb307615fb7afae68a7428"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.660879 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.661557 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" event={"ID":"6cbe9f19-4c05-4266-b4a8-53af41586325","Type":"ContainerStarted","Data":"d2ce165e3b25231005709e9b4ce8491cb6e82a71a62eddc5d9c460156c9ba5d9"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.668194 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" event={"ID":"b2d6ed98-7c31-42e0-8e85-c6cb28da320e","Type":"ContainerStarted","Data":"91169d41ffd325c961cdacc2084282f4272e84af0a2ea594ca5b3433e7d50dc2"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.668235 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" event={"ID":"b2d6ed98-7c31-42e0-8e85-c6cb28da320e","Type":"ContainerStarted","Data":"92ad8bd48d502282ff606af69fee17780c3a1598edcffb64d02bfa86c0e85e81"} Feb 17 16:44:35 crc kubenswrapper[4694]: W0217 
16:44:35.676765 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f479b7_8c1f_4fd0_a106_c912c664579a.slice/crio-7a651fa3c1f9d06fcb8114260cc5e54f69f9d650daf80994dff9482c547c1af1 WatchSource:0}: Error finding container 7a651fa3c1f9d06fcb8114260cc5e54f69f9d650daf80994dff9482c547c1af1: Status 404 returned error can't find the container with id 7a651fa3c1f9d06fcb8114260cc5e54f69f9d650daf80994dff9482c547c1af1 Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.677148 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.677460 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.177446438 +0000 UTC m=+143.934521762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.678317 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-896vh" event={"ID":"4387c481-04e8-4060-affe-f9b6fc0b1406","Type":"ContainerStarted","Data":"07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.678347 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-896vh" event={"ID":"4387c481-04e8-4060-affe-f9b6fc0b1406","Type":"ContainerStarted","Data":"ae9a2ef84a342ccdec952795ad9e6595dcb9a9b950cad3ef3dc9dd6410ab3f9f"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.710060 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" event={"ID":"eea272da-4da9-4f26-b66c-1aba9bbde6bc","Type":"ContainerStarted","Data":"887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.710458 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" event={"ID":"eea272da-4da9-4f26-b66c-1aba9bbde6bc","Type":"ContainerStarted","Data":"8fa6649b2529f89760c1210aae6991ed89ebc761d6a4f8f2c732056c9e56df54"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.710851 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:35 crc 
kubenswrapper[4694]: I0217 16:44:35.731529 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" event={"ID":"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc","Type":"ContainerStarted","Data":"ff185e39d30e166541a515fa1ddf2b025c3987efbe5c213e4249e5eaee16658c"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.731580 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" event={"ID":"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc","Type":"ContainerStarted","Data":"8bb790d7aa54e4c1ff6ef391c95af0ff1cf699ced590a9d63fadc8c1730c6b50"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.739912 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.754576 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" event={"ID":"d3ed4fee-4f78-4018-9a59-d8a98da1659f","Type":"ContainerStarted","Data":"9f9c3344d72c7513e827ac59f91d1c4b60d08f045cacdb1051451f9f762befdd"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.765927 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t7nr4" event={"ID":"c4aa91f8-086e-415b-aadc-da13d3d90ae9","Type":"ContainerStarted","Data":"016f3f317037b892dc140fb1e8f087a72d7992fbc82cd4109dada45380dad189"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.766281 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t7nr4" event={"ID":"c4aa91f8-086e-415b-aadc-da13d3d90ae9","Type":"ContainerStarted","Data":"a7f383b2949678b954f876f06edff08523ca7a2cb24fb3552887b14cabf99df7"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.787067 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.787146 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.788015 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" event={"ID":"e66f95bf-fb5b-4682-b8b1-910d78519ba4","Type":"ContainerStarted","Data":"75b895bd54557918121a6fd5a739a03d9c8f3687a335b6dd093dc045a1d1a26a"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.788057 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" event={"ID":"e66f95bf-fb5b-4682-b8b1-910d78519ba4","Type":"ContainerStarted","Data":"145b4a3a8ea060a1756b74ab87b9293fbaf4a246a1ae522f64a090c99c05af44"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.789653 4694 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vgfk2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.789724 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" podUID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: 
connection refused" Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.790422 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.790730 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.290707933 +0000 UTC m=+144.047783257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.794990 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" event={"ID":"d1c76767-8f16-4926-b632-8611bc27de87","Type":"ContainerStarted","Data":"aff6e7835dc8589138a916afdf29efc0819f7f4adbaecc79097c7cdb57bebad4"} Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.838380 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp"] Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.891595 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.891800 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.39176341 +0000 UTC m=+144.148838734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.891933 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:35 crc kubenswrapper[4694]: E0217 16:44:35.892558 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.39254099 +0000 UTC m=+144.149616314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:35 crc kubenswrapper[4694]: I0217 16:44:35.985080 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:35.987749 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:35.987816 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:35.994107 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:35.994248 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.494214522 +0000 UTC m=+144.251289846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:35.994389 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:35.994721 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.494710394 +0000 UTC m=+144.251785718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.096235 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.098076 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.598057869 +0000 UTC m=+144.355133183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.200395 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.200854 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.201153 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.701141307 +0000 UTC m=+144.458216641 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.204956 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd476af-578e-47ea-bfae-4b4d1303106c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9fhv7\" (UID: \"cbd476af-578e-47ea-bfae-4b4d1303106c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.252038 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gn9k9"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.286468 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.294314 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.305131 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.305278 4694 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.805249191 +0000 UTC m=+144.562324515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.305341 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.305823 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.805816975 +0000 UTC m=+144.562892299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.316831 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.341044 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.351344 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" Feb 17 16:44:36 crc kubenswrapper[4694]: W0217 16:44:36.382839 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc2bd3c_fb82_4835_9103_f7bf30e51f17.slice/crio-e1440956b6799cc5969f597c7761e7df93b2fac508b6d05da89b634f8ffd83eb WatchSource:0}: Error finding container e1440956b6799cc5969f597c7761e7df93b2fac508b6d05da89b634f8ffd83eb: Status 404 returned error can't find the container with id e1440956b6799cc5969f597c7761e7df93b2fac508b6d05da89b634f8ffd83eb Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.406556 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.406684 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.906659647 +0000 UTC m=+144.663734971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.407099 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.407464 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:36.907452307 +0000 UTC m=+144.664527631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: W0217 16:44:36.413653 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6e9465f_5af8_48cb_b71b_3453e04acb1a.slice/crio-d0bb490797b78b4e6e2ec5aafac0bf4849d08ddaa1fc97403c12e5018134eea8 WatchSource:0}: Error finding container d0bb490797b78b4e6e2ec5aafac0bf4849d08ddaa1fc97403c12e5018134eea8: Status 404 returned error can't find the container with id d0bb490797b78b4e6e2ec5aafac0bf4849d08ddaa1fc97403c12e5018134eea8 Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.507890 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.508193 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.008178345 +0000 UTC m=+144.765253669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.536563 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.548447 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.553125 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nsrtk" podStartSLOduration=124.553105391 podStartE2EDuration="2m4.553105391s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:36.549990193 +0000 UTC m=+144.307065517" watchObservedRunningTime="2026-02-17 16:44:36.553105391 +0000 UTC m=+144.310180715" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.568457 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.586298 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-896vh" podStartSLOduration=124.586278261 podStartE2EDuration="2m4.586278261s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:36.585132632 +0000 UTC m=+144.342207956" watchObservedRunningTime="2026-02-17 16:44:36.586278261 +0000 UTC m=+144.343353585" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.609055 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.609467 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.109452176 +0000 UTC m=+144.866527500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.678298 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" podStartSLOduration=124.678257487 podStartE2EDuration="2m4.678257487s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:36.621672175 +0000 UTC m=+144.378747509" watchObservedRunningTime="2026-02-17 16:44:36.678257487 +0000 UTC m=+144.435332811" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.678897 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bjgql" podStartSLOduration=124.678887743 podStartE2EDuration="2m4.678887743s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:36.665011822 +0000 UTC m=+144.422087146" watchObservedRunningTime="2026-02-17 16:44:36.678887743 +0000 UTC m=+144.435963077" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.724523 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.729594 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kclz7"] Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.731035 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.231005581 +0000 UTC m=+144.988080905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.741810 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.743217 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.243184679 +0000 UTC m=+145.000260003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.748334 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.748381 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8q2l"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.755293 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-lcq72"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.762858 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.793796 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-85fqn"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.796781 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.843390 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.843930 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.343890947 +0000 UTC m=+145.100966271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.848723 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-b24qw"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.854632 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" event={"ID":"348e6db7-381a-4772-abbf-812a3b883c17","Type":"ContainerStarted","Data":"91d1f1d3afe8268420757ceb63000211ee2df01fe09f121868551b698aa7de31"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.861949 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" event={"ID":"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a","Type":"ContainerStarted","Data":"3b7b37684fb259947913bc0443259408e482c1eb61ca37650cadd36ef7435f44"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.862004 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" 
event={"ID":"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a","Type":"ContainerStarted","Data":"0193643aa145716c782e7d71e7277d6a726242d97a484a4d63ae40dc2e96d883"} Feb 17 16:44:36 crc kubenswrapper[4694]: W0217 16:44:36.864625 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda0a671_138c_4f21_a089_d5a2ef8f0712.slice/crio-ff412f8de05ee1cebb0dacc84f68f8620f66c3c981048f846b62daee3ceb696b WatchSource:0}: Error finding container ff412f8de05ee1cebb0dacc84f68f8620f66c3c981048f846b62daee3ceb696b: Status 404 returned error can't find the container with id ff412f8de05ee1cebb0dacc84f68f8620f66c3c981048f846b62daee3ceb696b Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.867468 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-t7nr4" podStartSLOduration=123.867438733 podStartE2EDuration="2m3.867438733s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:36.858333483 +0000 UTC m=+144.615408807" watchObservedRunningTime="2026-02-17 16:44:36.867438733 +0000 UTC m=+144.624514067" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.871224 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" event={"ID":"d1c76767-8f16-4926-b632-8611bc27de87","Type":"ContainerStarted","Data":"00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.871261 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.874457 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-c7bvc" event={"ID":"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee","Type":"ContainerStarted","Data":"c52ce0c130b4a5c7c21871eec885e4d302fc315c6de9e8c0612d1f7993d271f3"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.874504 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-c7bvc" event={"ID":"f567ed6a-1cbc-4a2b-87bd-727d1a3d22ee","Type":"ContainerStarted","Data":"a0edb716e39de465bd405a1bc452d063e3029d79706e30963c2dcd7803ff06ff"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.878183 4694 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7vtpj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.878362 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" podUID="d1c76767-8f16-4926-b632-8611bc27de87" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.891563 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" event={"ID":"5b62f849-06cf-430a-8b44-c0b8b5a652c6","Type":"ContainerStarted","Data":"1f9731dc208ae9075a27e0e56fc0705419f94b00314a203f29edc59a25bd79bd"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.894644 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nq4xh"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.944896 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:36 crc kubenswrapper[4694]: E0217 16:44:36.945599 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.44558229 +0000 UTC m=+145.202657614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.972400 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r8g5f"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.973897 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.973999 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr"] Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.982227 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" 
event={"ID":"552639c4-d873-44a5-bbf1-0ada555d4d92","Type":"ContainerStarted","Data":"11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.982338 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" event={"ID":"552639c4-d873-44a5-bbf1-0ada555d4d92","Type":"ContainerStarted","Data":"7315024aa3f3f636da74751676e8a8ec75f1b741724931a56e3a50f95916afda"} Feb 17 16:44:36 crc kubenswrapper[4694]: I0217 16:44:36.983278 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.003726 4694 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fmljb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.003781 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.005712 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" event={"ID":"04ac4a19-2aa4-44da-ac5d-4df6622094b2","Type":"ContainerStarted","Data":"ab77d87d46992a632a5fae8f51b9d869f8e1278bbd12efbe9a1dd9d064c125f3"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.006930 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2"] Feb 17 16:44:37 crc kubenswrapper[4694]: 
I0217 16:44:37.009437 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:37 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:37 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:37 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.009502 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.013316 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" event={"ID":"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955","Type":"ContainerStarted","Data":"170d7e04610f1e825d9a8b10238776738296d097176f98821425ef4a555132af"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.013357 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" event={"ID":"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955","Type":"ContainerStarted","Data":"4ded38ff20611d098a8a5247ab0c39dd829609f99ee4eea4d91b1ce65ad9a3a2"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.017699 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" event={"ID":"63670946-1f60-490f-b79b-d4bacbc46803","Type":"ContainerStarted","Data":"d46916bd19d7d372fdf01aa470b0dbb7ecc8bc02622ba5257961109e69956fcf"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.028002 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" event={"ID":"5733b257-6fe2-4df1-aa83-4eaf3a84fdcc","Type":"ContainerStarted","Data":"36ddc99bc42b9a25bfa4b92c8f293cdfdf380d76ee00a42e6424309d3e6e411c"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.036479 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" event={"ID":"3d017650-73fd-4db1-958d-7bca865a125b","Type":"ContainerStarted","Data":"c01503b115113f6db19e7b042123d6bf3300355eccf92443ccf225fecbbc3b4b"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.042897 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-96t75" event={"ID":"d6283fbb-fd41-4d60-953a-26ee8e1c08e0","Type":"ContainerStarted","Data":"4a2af2d2b1262ded0c1e32ecc0cd01af30a1827fbaf1e984659c975e82e44050"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.042937 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-96t75" event={"ID":"d6283fbb-fd41-4d60-953a-26ee8e1c08e0","Type":"ContainerStarted","Data":"cda0c2bc86ee5b3f7a9853c3fe5352ba8aaa45d3f47075905be514f9d85ff3d5"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.045551 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.046794 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.047540 4694 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.547514759 +0000 UTC m=+145.304590083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.048599 4694 patch_prober.go:28] interesting pod/console-operator-58897d9998-96t75 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.048699 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-96t75" podUID="d6283fbb-fd41-4d60-953a-26ee8e1c08e0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.053342 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" event={"ID":"b6e9465f-5af8-48cb-b71b-3453e04acb1a","Type":"ContainerStarted","Data":"d0bb490797b78b4e6e2ec5aafac0bf4849d08ddaa1fc97403c12e5018134eea8"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.085358 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" event={"ID":"67ab93c3-9ab4-409c-b349-9032ff88e45b","Type":"ContainerStarted","Data":"4b03b669ef3119421816cbc580d002abd955d200c116c5baa70907eb1b636520"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.092810 4694 generic.go:334] "Generic (PLEG): container finished" podID="a8ed54f8-a641-4418-bf3f-59d8cb2ab7da" containerID="2fb99c1dc81aec32b4abfa1750e69e99b6852fd0453fd8cc5db43f2d1803b27f" exitCode=0 Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.092870 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" event={"ID":"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da","Type":"ContainerDied","Data":"2fb99c1dc81aec32b4abfa1750e69e99b6852fd0453fd8cc5db43f2d1803b27f"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.092895 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" event={"ID":"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da","Type":"ContainerStarted","Data":"37f647ed8781215f4fe728785fd0c3bc2ef9b566a38313fcf0496addcb9587d0"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.102118 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" event={"ID":"cbc2bd3c-fb82-4835-9103-f7bf30e51f17","Type":"ContainerStarted","Data":"e1440956b6799cc5969f597c7761e7df93b2fac508b6d05da89b634f8ffd83eb"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.106885 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" event={"ID":"dcfad47b-0808-41a1-aa1f-f23b6eb262bd","Type":"ContainerStarted","Data":"3b67bb2a86e24514c06307aad9597f94394268af60eea7885488ba2e1404134e"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.121904 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" event={"ID":"6cbe9f19-4c05-4266-b4a8-53af41586325","Type":"ContainerStarted","Data":"91fad3f0919378495c717ae73fae4680f9bf01c8768afac7a0356f13b4c1e3c9"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.126644 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" event={"ID":"5ca423ac-de33-427a-b561-f04e6631b6d8","Type":"ContainerStarted","Data":"38e6f770ee96c17c920d069fbe69991e20932ba12185bfeb16dddbec792a0d07"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.148116 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.150482 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.650470703 +0000 UTC m=+145.407546027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.164780 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" event={"ID":"e67cd89f-5768-44e0-9d5c-76e27ab585b7","Type":"ContainerStarted","Data":"da451aa7e34882d459925385415d3bcbacf86db544fd8edd1ebc16e69660a556"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.165135 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" event={"ID":"e67cd89f-5768-44e0-9d5c-76e27ab585b7","Type":"ContainerStarted","Data":"f2b5bb09e7e82a936b5c6366e1c7c525b99ad8530447c195e696e2c4dbc85a8b"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.184495 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" event={"ID":"17f479b7-8c1f-4fd0-a106-c912c664579a","Type":"ContainerStarted","Data":"7a651fa3c1f9d06fcb8114260cc5e54f69f9d650daf80994dff9482c547c1af1"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.202348 4694 generic.go:334] "Generic (PLEG): container finished" podID="d3ed4fee-4f78-4018-9a59-d8a98da1659f" containerID="29f22d8766d61e9e5cebbaf87bdcf86da8a26800578610c38933f248ad543865" exitCode=0 Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.203784 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" 
event={"ID":"d3ed4fee-4f78-4018-9a59-d8a98da1659f","Type":"ContainerDied","Data":"29f22d8766d61e9e5cebbaf87bdcf86da8a26800578610c38933f248ad543865"} Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.204882 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.204921 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.233394 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.249254 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.252478 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.752441963 +0000 UTC m=+145.509517287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.252853 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.256020 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.756009023 +0000 UTC m=+145.513084347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.259321 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-whdfp" podStartSLOduration=124.259300877 podStartE2EDuration="2m4.259300877s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.219861189 +0000 UTC m=+144.976936503" watchObservedRunningTime="2026-02-17 16:44:37.259300877 +0000 UTC m=+145.016376211" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.353566 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.354348 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.854333791 +0000 UTC m=+145.611409115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.473719 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.474395 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:37.974382928 +0000 UTC m=+145.731458252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.504260 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-96t75" podStartSLOduration=125.504241603 podStartE2EDuration="2m5.504241603s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.495852091 +0000 UTC m=+145.252927425" watchObservedRunningTime="2026-02-17 16:44:37.504241603 +0000 UTC m=+145.261316927" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.575309 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.575749 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.075734112 +0000 UTC m=+145.832809436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.618850 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" podStartSLOduration=124.618834553 podStartE2EDuration="2m4.618834553s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.581377645 +0000 UTC m=+145.338452959" watchObservedRunningTime="2026-02-17 16:44:37.618834553 +0000 UTC m=+145.375909877" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.619898 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" podStartSLOduration=124.619893859 podStartE2EDuration="2m4.619893859s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.619016757 +0000 UTC m=+145.376092081" watchObservedRunningTime="2026-02-17 16:44:37.619893859 +0000 UTC m=+145.376969183" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.680651 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.682753 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.182723839 +0000 UTC m=+145.939799173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.726122 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-c7bvc" podStartSLOduration=5.726100506 podStartE2EDuration="5.726100506s" podCreationTimestamp="2026-02-17 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.722044294 +0000 UTC m=+145.479119628" watchObservedRunningTime="2026-02-17 16:44:37.726100506 +0000 UTC m=+145.483175830" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.779903 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cx4lp" podStartSLOduration=125.779878367 podStartE2EDuration="2m5.779878367s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 16:44:37.774994773 +0000 UTC m=+145.532070097" watchObservedRunningTime="2026-02-17 16:44:37.779878367 +0000 UTC m=+145.536953711" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.792180 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.792815 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.292774083 +0000 UTC m=+146.049849407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.819921 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" podStartSLOduration=125.819904279 podStartE2EDuration="2m5.819904279s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.819403017 +0000 UTC m=+145.576478341" watchObservedRunningTime="2026-02-17 16:44:37.819904279 +0000 UTC m=+145.576979603" Feb 
17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.877251 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-c9hn8" podStartSLOduration=124.87723136 podStartE2EDuration="2m4.87723136s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.8468265 +0000 UTC m=+145.603901824" watchObservedRunningTime="2026-02-17 16:44:37.87723136 +0000 UTC m=+145.634306684" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.894430 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.894871 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.394856426 +0000 UTC m=+146.151931750 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.922253 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t44rx" podStartSLOduration=124.922231088 podStartE2EDuration="2m4.922231088s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:37.918179316 +0000 UTC m=+145.675254640" watchObservedRunningTime="2026-02-17 16:44:37.922231088 +0000 UTC m=+145.679306422" Feb 17 16:44:37 crc kubenswrapper[4694]: I0217 16:44:37.996032 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:37 crc kubenswrapper[4694]: E0217 16:44:37.996694 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.496677492 +0000 UTC m=+146.253752816 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.011770 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:38 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:38 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:38 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.011808 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.097568 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.097934 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 16:44:38.597922743 +0000 UTC m=+146.354998057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.198283 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.198789 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.698773464 +0000 UTC m=+146.455848788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.230847 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" event={"ID":"63670946-1f60-490f-b79b-d4bacbc46803","Type":"ContainerStarted","Data":"842fbed3a59578461b604fbf704cb1bfb5c7dee40e376a54d1da88d664fda0ee"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.252970 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ttftl" podStartSLOduration=126.252896064 podStartE2EDuration="2m6.252896064s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.252659678 +0000 UTC m=+146.009735002" watchObservedRunningTime="2026-02-17 16:44:38.252896064 +0000 UTC m=+146.009971388" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.260299 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nq4xh" event={"ID":"70100d5b-32f2-496c-af2f-f08ce53c9148","Type":"ContainerStarted","Data":"bd39cd46ca7952c7fd328c4181c7bdba4af8632ac9f890ef232722b20c0644ae"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.304043 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" 
event={"ID":"5ca423ac-de33-427a-b561-f04e6631b6d8","Type":"ContainerStarted","Data":"cf851c148a3c153fb72a63c73b607d0f0bde99144dcb38dcdd8b7ebb36cc8366"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.307867 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.308302 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.808290115 +0000 UTC m=+146.565365439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.334156 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" event={"ID":"a5b422d1-061f-4d69-9ed9-247f4930fe99","Type":"ContainerStarted","Data":"4a4251eb8988dddbace028f765b06ad9ca315dbe1ad297895c6293cbe39bcc43"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.352454 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" 
event={"ID":"6914ad57-b1bc-4449-abe9-02e7183d92a9","Type":"ContainerStarted","Data":"3020bc170ccea448c3ed8d683de13d9f1783e7914d04ef60ce1dc57bf6ea746f"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.352510 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" event={"ID":"6914ad57-b1bc-4449-abe9-02e7183d92a9","Type":"ContainerStarted","Data":"4a64de588f483876cdb14240017831974e8e0480e19bbbe6805de775354a4eea"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.370050 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" event={"ID":"5b62f849-06cf-430a-8b44-c0b8b5a652c6","Type":"ContainerStarted","Data":"09deec4cf971e39c3b821e55d80c23bee89608498139f4183b74aea1938a49db"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.409091 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.410143 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:38.910125681 +0000 UTC m=+146.667201005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.436230 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" event={"ID":"1e0a66e8-45d1-43ff-8a2b-c3614b8ac955","Type":"ContainerStarted","Data":"f1a8078fd00a3175dbfaecc1d337bf580bda76494140c056fb4bd64b97f014ed"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.453414 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" event={"ID":"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c","Type":"ContainerStarted","Data":"0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.453827 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" event={"ID":"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c","Type":"ContainerStarted","Data":"b1f872531a34ead573707e214f0c8bb471608d5bf02b07c15bebccba41f21764"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.455172 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.461264 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-lcq72" podStartSLOduration=125.461247335 podStartE2EDuration="2m5.461247335s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.423172712 +0000 UTC m=+146.180248046" watchObservedRunningTime="2026-02-17 16:44:38.461247335 +0000 UTC m=+146.218322659" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.473216 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vjhtp" podStartSLOduration=125.473193027 podStartE2EDuration="2m5.473193027s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.460456695 +0000 UTC m=+146.217532029" watchObservedRunningTime="2026-02-17 16:44:38.473193027 +0000 UTC m=+146.230268351" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.481880 4694 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8q2l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.481941 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.513988 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: 
\"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.515477 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.015457896 +0000 UTC m=+146.772533220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.525378 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" event={"ID":"b6e9465f-5af8-48cb-b71b-3453e04acb1a","Type":"ContainerStarted","Data":"d8c82a044c983b23d3d68c315e4cd1aa66f2024c53f76b8be46f6f040365379e"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.525419 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" event={"ID":"b6e9465f-5af8-48cb-b71b-3453e04acb1a","Type":"ContainerStarted","Data":"c49ea404a415264d6c437779993e839ac1bcc188403a6ba9a18542993fbcd1fa"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.555787 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" podStartSLOduration=125.555770376 podStartE2EDuration="2m5.555770376s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.552104363 +0000 UTC m=+146.309179687" watchObservedRunningTime="2026-02-17 16:44:38.555770376 +0000 UTC m=+146.312845700" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.556819 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" event={"ID":"321eb694-d16e-4909-a6ee-7d36a8be937c","Type":"ContainerStarted","Data":"9f05c049fa57d68be490d3f68a05b8dfefd0d43ea9aa92394e5e0cfa45fe6174"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.556853 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" event={"ID":"321eb694-d16e-4909-a6ee-7d36a8be937c","Type":"ContainerStarted","Data":"027d24024eceb9d79cc071bcc43090b6928ec6207b1aceefa1f44f8b31f6e99d"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.568245 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" event={"ID":"c3096ffe-2960-4b33-9e8b-935b818b973c","Type":"ContainerStarted","Data":"90d22cea68fe607666996c20795f42c451db1faccce82bbd79991fbc3f3e04a1"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.568295 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" event={"ID":"c3096ffe-2960-4b33-9e8b-935b818b973c","Type":"ContainerStarted","Data":"f10985acaf40041f1bedf41f6e294089aa5a95498a2b146bb07c377b7b3264a8"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.569106 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.583739 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" event={"ID":"81c1264d-0ebb-41e1-aacd-b68532d19b93","Type":"ContainerStarted","Data":"943e9d633398f85e8f2c9091b6ba09c2c8d91cd0a592116d8578148f40da5b06"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.588301 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zjbh5" event={"ID":"cbc2bd3c-fb82-4835-9103-f7bf30e51f17","Type":"ContainerStarted","Data":"c9fdf2420be0313d226c00c660319c00999cd99a07fad12854105b8468119b5c"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.604144 4694 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-26p7p container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.604225 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" podUID="c3096ffe-2960-4b33-9e8b-935b818b973c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.605488 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ljj7b" podStartSLOduration=125.605478114 podStartE2EDuration="2m5.605478114s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.604424147 +0000 UTC m=+146.361499461" watchObservedRunningTime="2026-02-17 16:44:38.605478114 +0000 UTC m=+146.362553448" Feb 17 16:44:38 crc 
kubenswrapper[4694]: I0217 16:44:38.615396 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.616411 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.11639357 +0000 UTC m=+146.873468894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.624968 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r8g5f" event={"ID":"cda0a671-138c-4f21-a089-d5a2ef8f0712","Type":"ContainerStarted","Data":"bc8f0c3ff2703fe8ae071ecc78b9ce34ca3fd18348a2c3254a3f02b3bfe576c4"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.625028 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r8g5f" event={"ID":"cda0a671-138c-4f21-a089-d5a2ef8f0712","Type":"ContainerStarted","Data":"ff412f8de05ee1cebb0dacc84f68f8620f66c3c981048f846b62daee3ceb696b"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.652035 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" event={"ID":"17f479b7-8c1f-4fd0-a106-c912c664579a","Type":"ContainerStarted","Data":"2f1a60b38f22a7a89e863e62afd45209a2033d81e275c734fafc5a5cf13681ef"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.652383 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" event={"ID":"17f479b7-8c1f-4fd0-a106-c912c664579a","Type":"ContainerStarted","Data":"4872f00da4d0af23f8006070c8f0bcfcd60bb710633743aa6841c2ecbfb87571"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.656157 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" event={"ID":"66b75582-c991-495b-aa57-74aaac8319e2","Type":"ContainerStarted","Data":"64b3e87f69c08f6e207d12038209f5072f4c5936f515fbe2abc552bbdb551e79"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.669723 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" podStartSLOduration=125.669703288 podStartE2EDuration="2m5.669703288s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.636363655 +0000 UTC m=+146.393438979" watchObservedRunningTime="2026-02-17 16:44:38.669703288 +0000 UTC m=+146.426778612" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.696838 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" event={"ID":"3d017650-73fd-4db1-958d-7bca865a125b","Type":"ContainerStarted","Data":"1034f1d5384d750321c92b3a047de104d4408b6ab9b29a5f021c0ae62bdaf266"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.697727 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.710853 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vxddf" podStartSLOduration=125.710829329 podStartE2EDuration="2m5.710829329s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.671222457 +0000 UTC m=+146.428297781" watchObservedRunningTime="2026-02-17 16:44:38.710829329 +0000 UTC m=+146.467904653" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.712692 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" podStartSLOduration=125.712676596 podStartE2EDuration="2m5.712676596s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.709968387 +0000 UTC m=+146.467043711" watchObservedRunningTime="2026-02-17 16:44:38.712676596 +0000 UTC m=+146.469751920" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.717356 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.718916 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 16:44:39.218898453 +0000 UTC m=+146.975973877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.720717 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" event={"ID":"619cd21e-c5e1-4c89-9907-242d8e394477","Type":"ContainerStarted","Data":"2306b31fee135394292d9bd3bb4832af73e5f578914e0bfc6081383813b2aa5e"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.720755 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" event={"ID":"619cd21e-c5e1-4c89-9907-242d8e394477","Type":"ContainerStarted","Data":"261b3165c64fa1595fc8e3937ed5c3df84d7b3d98ffe8bf90df879991c4c0533"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.732026 4694 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rzdk7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.732087 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" podUID="3d017650-73fd-4db1-958d-7bca865a125b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Feb 17 16:44:38 
crc kubenswrapper[4694]: I0217 16:44:38.738310 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" event={"ID":"04ac4a19-2aa4-44da-ac5d-4df6622094b2","Type":"ContainerStarted","Data":"8bed84bb390689d0473185d854ba1a7944b04c6ef7c1dd3da724bcceeba88939"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.759535 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" event={"ID":"333fd19f-3ad1-4f3d-98d5-5bca3ffb9e2a","Type":"ContainerStarted","Data":"e9c67833699a1f5fabcf35663f3bc1862292966a53d4849f69a91fc1f1d62ec9"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.774904 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" event={"ID":"dcfad47b-0808-41a1-aa1f-f23b6eb262bd","Type":"ContainerStarted","Data":"cab86d01c3fb00168df07ec67ab48d718b3387c36c0c93710ab89c79b2ed7535"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.809970 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" event={"ID":"5b4e3579-1839-4de9-8e52-71ca7976b353","Type":"ContainerStarted","Data":"ef7002ba218b3ba75e25ed226efdc2d8495111f513c18c0e953fb04e1d6034ce"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.810029 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" event={"ID":"5b4e3579-1839-4de9-8e52-71ca7976b353","Type":"ContainerStarted","Data":"d696f4ec64ef6860f2cd83278ac2fac24a7ad64ea9d7c9d183973d56a76343e6"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.810950 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:38 crc kubenswrapper[4694]: 
I0217 16:44:38.818509 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kclz7" podStartSLOduration=125.818489393 podStartE2EDuration="2m5.818489393s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.772428597 +0000 UTC m=+146.529503921" watchObservedRunningTime="2026-02-17 16:44:38.818489393 +0000 UTC m=+146.575564717" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.821372 4694 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pj4d2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.821444 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" podUID="5b4e3579-1839-4de9-8e52-71ca7976b353" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.822758 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.823418 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-17 16:44:39.323403717 +0000 UTC m=+147.080479041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.825070 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-b24qw" podStartSLOduration=126.825048189 podStartE2EDuration="2m6.825048189s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.822177296 +0000 UTC m=+146.579252630" watchObservedRunningTime="2026-02-17 16:44:38.825048189 +0000 UTC m=+146.582123513" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.828993 4694 generic.go:334] "Generic (PLEG): container finished" podID="67ab93c3-9ab4-409c-b349-9032ff88e45b" containerID="0980f71cad6b254abaead428e4ab86a758433e7ae127ed4a521a647e5d914c30" exitCode=0 Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.829071 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" event={"ID":"67ab93c3-9ab4-409c-b349-9032ff88e45b","Type":"ContainerDied","Data":"0980f71cad6b254abaead428e4ab86a758433e7ae127ed4a521a647e5d914c30"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.859982 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" 
event={"ID":"cbd476af-578e-47ea-bfae-4b4d1303106c","Type":"ContainerStarted","Data":"426fe63bb5a6f90aebc4ccc021212550520c7b2e55c18eca975e19d7d915bfba"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.860031 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" event={"ID":"cbd476af-578e-47ea-bfae-4b4d1303106c","Type":"ContainerStarted","Data":"8296c6f2ba654b40d7e98d3b447adcf01bb05e0d4af3b34ae0da526c03b3b92f"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.901492 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.901546 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.931299 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:38 crc kubenswrapper[4694]: E0217 16:44:38.934237 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 16:44:39.434221421 +0000 UTC m=+147.191296755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.956717 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" podStartSLOduration=125.956697339 podStartE2EDuration="2m5.956697339s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.89228813 +0000 UTC m=+146.649363454" watchObservedRunningTime="2026-02-17 16:44:38.956697339 +0000 UTC m=+146.713772663" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.956875 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" podStartSLOduration=126.956870114 podStartE2EDuration="2m6.956870114s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.955735165 +0000 UTC m=+146.712810499" watchObservedRunningTime="2026-02-17 16:44:38.956870114 +0000 UTC m=+146.713945438" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.965007 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" 
event={"ID":"9457bb31-c044-484a-bc14-44e93af90889","Type":"ContainerStarted","Data":"1ba767a1004652f6160d8794ccdc20b7de05a623e9dfd501934d01003d12cfac"} Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.965072 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.965096 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.998690 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:38 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:38 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:38 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:38 crc kubenswrapper[4694]: I0217 16:44:38.998770 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.029985 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-wq4dh" podStartSLOduration=127.029969593 podStartE2EDuration="2m7.029969593s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:38.98755539 +0000 UTC m=+146.744630714" watchObservedRunningTime="2026-02-17 16:44:39.029969593 +0000 UTC m=+146.787044917" Feb 17 
16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.030385 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l4wps" podStartSLOduration=127.030381173 podStartE2EDuration="2m7.030381173s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.027983673 +0000 UTC m=+146.785058997" watchObservedRunningTime="2026-02-17 16:44:39.030381173 +0000 UTC m=+146.787456497" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.032358 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.033775 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.533758379 +0000 UTC m=+147.290833703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.090528 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r8g5f" podStartSLOduration=7.090503544 podStartE2EDuration="7.090503544s" podCreationTimestamp="2026-02-17 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.078938942 +0000 UTC m=+146.836014256" watchObservedRunningTime="2026-02-17 16:44:39.090503544 +0000 UTC m=+146.847578868" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.142042 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.142441 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.642425158 +0000 UTC m=+147.399500482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.244005 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9fhv7" podStartSLOduration=127.243975227 podStartE2EDuration="2m7.243975227s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.242067809 +0000 UTC m=+146.999143133" watchObservedRunningTime="2026-02-17 16:44:39.243975227 +0000 UTC m=+147.001050551" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.247216 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.247545 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.747527837 +0000 UTC m=+147.504603161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.281940 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" podStartSLOduration=126.281921077 podStartE2EDuration="2m6.281921077s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.281114897 +0000 UTC m=+147.038190221" watchObservedRunningTime="2026-02-17 16:44:39.281921077 +0000 UTC m=+147.038996401" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.335135 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" podStartSLOduration=126.335102582 podStartE2EDuration="2m6.335102582s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.335055951 +0000 UTC m=+147.092131285" watchObservedRunningTime="2026-02-17 16:44:39.335102582 +0000 UTC m=+147.092177906" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.349315 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: 
\"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.349723 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.849707392 +0000 UTC m=+147.606782716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.352878 4694 csr.go:261] certificate signing request csr-vb8q6 is approved, waiting to be issued Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.376004 4694 csr.go:257] certificate signing request csr-vb8q6 is issued Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.434816 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqffd" podStartSLOduration=127.434797705 podStartE2EDuration="2m7.434797705s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.432422595 +0000 UTC m=+147.189497919" watchObservedRunningTime="2026-02-17 16:44:39.434797705 +0000 UTC m=+147.191873019" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.461076 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.461383 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:39.961369197 +0000 UTC m=+147.718444521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.562217 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.562549 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.062538116 +0000 UTC m=+147.819613440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.652087 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-96t75" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.663643 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.663740 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.163714466 +0000 UTC m=+147.920789790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.663970 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.664251 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.16424357 +0000 UTC m=+147.921318894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.765316 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.765446 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.265421789 +0000 UTC m=+148.022497113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.765522 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.765823 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.265814629 +0000 UTC m=+148.022889953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.866566 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.866740 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.366716772 +0000 UTC m=+148.123792096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.866985 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.867275 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.367264306 +0000 UTC m=+148.124339630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.903806 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" event={"ID":"d3ed4fee-4f78-4018-9a59-d8a98da1659f","Type":"ContainerStarted","Data":"ddd492c9b3c4c091a225acbca6d55ddee34d669aeb1846d424395d98084878fd"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.903886 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" event={"ID":"d3ed4fee-4f78-4018-9a59-d8a98da1659f","Type":"ContainerStarted","Data":"b01b3c56694728e50a71c0943e71ab48dc61747555548fbb81c631d99b6c5d54"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.905482 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" event={"ID":"66b75582-c991-495b-aa57-74aaac8319e2","Type":"ContainerStarted","Data":"3bcbeff4141fda9ada4f1bbe9986b985be995bef2f8542021e7207d0b77db435"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.905539 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" event={"ID":"66b75582-c991-495b-aa57-74aaac8319e2","Type":"ContainerStarted","Data":"1c00c7d11b1502d787f31115b0d6d713fad0d1333db94d06b7c17bba97763ee6"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.907435 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nq4xh" 
event={"ID":"70100d5b-32f2-496c-af2f-f08ce53c9148","Type":"ContainerStarted","Data":"e70332ebd25bf4a94677e1add9fa9a89b41e4f72ed3c92d0999402d6929a40aa"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.907496 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nq4xh" event={"ID":"70100d5b-32f2-496c-af2f-f08ce53c9148","Type":"ContainerStarted","Data":"c171131be9c24d84c0d84bc265592fc944fd1867823dad05d7a2de9213fecc9b"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.907540 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.913007 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" event={"ID":"9457bb31-c044-484a-bc14-44e93af90889","Type":"ContainerStarted","Data":"931a1b6eda9a491caa29d50e68dc29959a4598b96828a7343ddc630b47d7816b"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.913045 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6swq2" event={"ID":"9457bb31-c044-484a-bc14-44e93af90889","Type":"ContainerStarted","Data":"0f0e455e397b04911ab1360d9ce87da574cdcfd7055ae09beed4287df5e4d689"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.915431 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" event={"ID":"a5b422d1-061f-4d69-9ed9-247f4930fe99","Type":"ContainerStarted","Data":"9b45fb464e59a5f699c4da9be7ba6e1bde91c029be7bb9a723b812ce580f3e05"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.915478 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" event={"ID":"a5b422d1-061f-4d69-9ed9-247f4930fe99","Type":"ContainerStarted","Data":"e85f96a3e8829286c4f3973cb5a11280405d3ad4e2916a2ec3d3c47363c7fdc8"} Feb 17 
16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.917465 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" event={"ID":"a8ed54f8-a641-4418-bf3f-59d8cb2ab7da","Type":"ContainerStarted","Data":"a676bb0c726b9f4af93ea1c45e77d33685c1246cf27a4927b99f3ff332bb074e"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.919420 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" event={"ID":"67ab93c3-9ab4-409c-b349-9032ff88e45b","Type":"ContainerStarted","Data":"61e62739dc1770bccd80151e616f9ef323b42bc2bf86a20682975630ea74dfd0"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.919563 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.921254 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" event={"ID":"5ca423ac-de33-427a-b561-f04e6631b6d8","Type":"ContainerStarted","Data":"aa1edf4379783df53f7e35b184b9f937a64c02af7dc3bb9ac033d2656548a1e5"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.921378 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.922519 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" event={"ID":"348e6db7-381a-4772-abbf-812a3b883c17","Type":"ContainerStarted","Data":"d7d143db32defbb2554db83de6d17bfeff1cd6b0ef5816cae470420b60c5d5b1"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.924200 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jltpb" 
event={"ID":"81c1264d-0ebb-41e1-aacd-b68532d19b93","Type":"ContainerStarted","Data":"67ff75020af5247fdaf803134eb58fef68216db1c4444371e132e0eda5be8f9f"} Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.925085 4694 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8q2l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.925137 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.943308 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pj4d2" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.957234 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-26p7p" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.967968 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.968071 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-17 16:44:40.468053376 +0000 UTC m=+148.225128700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.968317 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:39 crc kubenswrapper[4694]: E0217 16:44:39.968620 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.468600719 +0000 UTC m=+148.225676043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.985887 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" podStartSLOduration=127.985872966 podStartE2EDuration="2m7.985872966s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:39.969293657 +0000 UTC m=+147.726368981" watchObservedRunningTime="2026-02-17 16:44:39.985872966 +0000 UTC m=+147.742948280" Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.990128 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:39 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:39 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:39 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:39 crc kubenswrapper[4694]: I0217 16:44:39.990173 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.047195 4694 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-85fqn" podStartSLOduration=127.047143656 podStartE2EDuration="2m7.047143656s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:40.014690495 +0000 UTC m=+147.771765819" watchObservedRunningTime="2026-02-17 16:44:40.047143656 +0000 UTC m=+147.804218970" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.069672 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.071093 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.571077712 +0000 UTC m=+148.328153036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.115761 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qn6jr" podStartSLOduration=128.115741002 podStartE2EDuration="2m8.115741002s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:40.053962419 +0000 UTC m=+147.811037743" watchObservedRunningTime="2026-02-17 16:44:40.115741002 +0000 UTC m=+147.872816326" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.167641 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.171847 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.172294 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 16:44:40.672282342 +0000 UTC m=+148.429357666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.216038 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nq4xh" podStartSLOduration=8.216020388 podStartE2EDuration="8.216020388s" podCreationTimestamp="2026-02-17 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:40.127097219 +0000 UTC m=+147.884172543" watchObservedRunningTime="2026-02-17 16:44:40.216020388 +0000 UTC m=+147.973095712" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.216979 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" podStartSLOduration=128.216973962 podStartE2EDuration="2m8.216973962s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:40.212437317 +0000 UTC m=+147.969512641" watchObservedRunningTime="2026-02-17 16:44:40.216973962 +0000 UTC m=+147.974049286" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.272747 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.272913 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.772879156 +0000 UTC m=+148.529954480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.272976 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.273288 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.773276877 +0000 UTC m=+148.530352201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.311764 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" podStartSLOduration=127.31174209 podStartE2EDuration="2m7.31174209s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:40.311426472 +0000 UTC m=+148.068501796" watchObservedRunningTime="2026-02-17 16:44:40.31174209 +0000 UTC m=+148.068817414" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.374573 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.374778 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.874747174 +0000 UTC m=+148.631822518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.375032 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.375381 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.875366149 +0000 UTC m=+148.632441473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.377292 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 16:39:39 +0000 UTC, rotation deadline is 2026-12-01 05:31:37.90446851 +0000 UTC Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.377331 4694 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6876h46m57.527140391s for next certificate rotation Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.432838 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" podStartSLOduration=127.432820593 podStartE2EDuration="2m7.432820593s" podCreationTimestamp="2026-02-17 16:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:40.432559226 +0000 UTC m=+148.189634550" watchObservedRunningTime="2026-02-17 16:44:40.432820593 +0000 UTC m=+148.189895917" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.475937 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.476112 4694 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.976087567 +0000 UTC m=+148.733162881 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.476225 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.476589 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:40.97657519 +0000 UTC m=+148.733650514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.577256 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.577431 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.077405891 +0000 UTC m=+148.834481215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.577517 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.577842 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.077834001 +0000 UTC m=+148.834909325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.678933 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.679050 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.179032352 +0000 UTC m=+148.936107676 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.679127 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.679455 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.179445742 +0000 UTC m=+148.936521066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.780684 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.780884 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.780937 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.781101 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.281074393 +0000 UTC m=+149.038149717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.796597 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.797596 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.882308 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 
16:44:40.882369 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.882453 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.882729 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.382716365 +0000 UTC m=+149.139791689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.889187 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.890249 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.919097 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.930988 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.934238 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.974146 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" event={"ID":"348e6db7-381a-4772-abbf-812a3b883c17","Type":"ContainerStarted","Data":"d41162fd6e782726a6175e262c45bfaf7981006d74abc9f37fd001fa937886d4"} Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.976432 4694 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8q2l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.976501 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 17 16:44:40 crc kubenswrapper[4694]: I0217 16:44:40.996325 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:40 crc kubenswrapper[4694]: E0217 16:44:40.996752 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.496738499 +0000 UTC m=+149.253813823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.001881 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:41 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:41 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:41 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.001926 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.101268 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.102876 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gp2g7"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.103787 4694 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.105648 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.605633234 +0000 UTC m=+149.362708638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.107645 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.139438 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp2g7"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.206096 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.206499 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 16:44:41.706480576 +0000 UTC m=+149.463555900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.287646 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rmjbf"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308640 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrhd\" (UniqueName: \"kubernetes.io/projected/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-kube-api-access-bsrhd\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308683 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-utilities\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308718 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308752 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-utilities\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308774 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-catalog-content\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308801 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-catalog-content\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.308843 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72ks\" (UniqueName: \"kubernetes.io/projected/0f68e586-955c-4c2c-8b3e-a91f6b95a442-kube-api-access-b72ks\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.312031 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.315098 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.320186 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.820169592 +0000 UTC m=+149.577244916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.324034 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmjbf"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.410272 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.410737 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-utilities\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " 
pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.410748 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.910720513 +0000 UTC m=+149.667795837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.410815 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-catalog-content\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.410871 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-catalog-content\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.410929 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b72ks\" (UniqueName: \"kubernetes.io/projected/0f68e586-955c-4c2c-8b3e-a91f6b95a442-kube-api-access-b72ks\") pod \"community-operators-rmjbf\" 
(UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.411000 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrhd\" (UniqueName: \"kubernetes.io/projected/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-kube-api-access-bsrhd\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.411019 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-utilities\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.411052 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.411358 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:41.911346919 +0000 UTC m=+149.668422243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.411962 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-utilities\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.412144 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-catalog-content\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.412354 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-utilities\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.412545 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-catalog-content\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 
16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.454107 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72ks\" (UniqueName: \"kubernetes.io/projected/0f68e586-955c-4c2c-8b3e-a91f6b95a442-kube-api-access-b72ks\") pod \"community-operators-rmjbf\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.486046 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrhd\" (UniqueName: \"kubernetes.io/projected/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-kube-api-access-bsrhd\") pod \"certified-operators-gp2g7\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.491591 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdxh6"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.493995 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.513148 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.513891 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:42.013875842 +0000 UTC m=+149.770951166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.517286 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdxh6"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.614475 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-catalog-content\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.614536 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.614593 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tjdc\" (UniqueName: \"kubernetes.io/projected/33ad90cd-78de-4743-b788-a02aca87e94a-kube-api-access-4tjdc\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.614642 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-utilities\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.615407 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:42.115392651 +0000 UTC m=+149.872467975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: W0217 16:44:41.660933 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-0639052c48644a10c2aa19f9fb886c8e84d8fe417001136d98a174aaae7ec7e1 WatchSource:0}: Error finding container 0639052c48644a10c2aa19f9fb886c8e84d8fe417001136d98a174aaae7ec7e1: Status 404 returned error can't find the container with id 0639052c48644a10c2aa19f9fb886c8e84d8fe417001136d98a174aaae7ec7e1 Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.672016 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.698955 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mb42m"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.703309 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716568 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716691 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tjdc\" (UniqueName: \"kubernetes.io/projected/33ad90cd-78de-4743-b788-a02aca87e94a-kube-api-access-4tjdc\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716719 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-utilities\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716741 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-utilities\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " 
pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716777 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtt8\" (UniqueName: \"kubernetes.io/projected/d7441c9b-9c03-4267-a0da-376c7d4bcf66-kube-api-access-mbtt8\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716795 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-catalog-content\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.716824 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-catalog-content\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.717212 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-catalog-content\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.717277 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 16:44:42.217261898 +0000 UTC m=+149.974337222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.717713 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-utilities\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.723218 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.725756 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mb42m"] Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.796706 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tjdc\" (UniqueName: \"kubernetes.io/projected/33ad90cd-78de-4743-b788-a02aca87e94a-kube-api-access-4tjdc\") pod \"certified-operators-kdxh6\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.819157 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-utilities\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.819206 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtt8\" (UniqueName: \"kubernetes.io/projected/d7441c9b-9c03-4267-a0da-376c7d4bcf66-kube-api-access-mbtt8\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.819223 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-catalog-content\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.819257 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.819511 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:42.319500105 +0000 UTC m=+150.076575429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.820586 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-catalog-content\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.823170 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-utilities\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.857239 4694 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.863088 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtt8\" (UniqueName: \"kubernetes.io/projected/d7441c9b-9c03-4267-a0da-376c7d4bcf66-kube-api-access-mbtt8\") pod \"community-operators-mb42m\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:41 crc kubenswrapper[4694]: W0217 16:44:41.908410 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-964d4c97e42dc1bd2b5d525ded7088c84878afc7c9dcf176230b19f0859596f9 WatchSource:0}: Error finding container 964d4c97e42dc1bd2b5d525ded7088c84878afc7c9dcf176230b19f0859596f9: Status 404 returned error can't find the container with id 964d4c97e42dc1bd2b5d525ded7088c84878afc7c9dcf176230b19f0859596f9 Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.920839 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.921073 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:42.421021763 +0000 UTC m=+150.178097087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.921163 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:41 crc kubenswrapper[4694]: E0217 16:44:41.921444 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 16:44:42.421432263 +0000 UTC m=+150.178507587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6d7zh" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.968499 4694 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.989859 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:41 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:41 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:41 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.990110 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:41 crc kubenswrapper[4694]: I0217 16:44:41.999112 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"964d4c97e42dc1bd2b5d525ded7088c84878afc7c9dcf176230b19f0859596f9"} Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.000747 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b317cc58cff8bc2718f797698ca2832826e3d608ae07c90aa72ff5e8e769e6eb"} Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.011439 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" event={"ID":"348e6db7-381a-4772-abbf-812a3b883c17","Type":"ContainerStarted","Data":"dbcbbb2a2fbdfda02214aa1f03e8a5e908cbde3beb7b5e0080c560335dfac9b3"} Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.019357 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0639052c48644a10c2aa19f9fb886c8e84d8fe417001136d98a174aaae7ec7e1"} Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.025110 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:42 crc kubenswrapper[4694]: E0217 16:44:42.026328 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 16:44:42.526313027 +0000 UTC m=+150.283388351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.046247 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" podStartSLOduration=10.046226291 podStartE2EDuration="10.046226291s" podCreationTimestamp="2026-02-17 16:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:42.043099801 +0000 UTC m=+149.800175125" watchObservedRunningTime="2026-02-17 16:44:42.046226291 +0000 UTC m=+149.803301615" Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.107304 4694 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T16:44:41.968536385Z","Handler":null,"Name":""} Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.118678 4694 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.118710 4694 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.127874 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.128197 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.131624 4694 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.131684 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.169022 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmjbf"] Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.204317 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6d7zh\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:42 crc 
kubenswrapper[4694]: I0217 16:44:42.230173 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.269448 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.272815 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gp2g7"] Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.343063 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdxh6"] Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.347279 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:42 crc kubenswrapper[4694]: W0217 16:44:42.353839 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ad90cd_78de_4743_b788_a02aca87e94a.slice/crio-e888087c88bd7701753415ba453c3ddad5ef7f2f9a76ad911fada0c821c809bc WatchSource:0}: Error finding container e888087c88bd7701753415ba453c3ddad5ef7f2f9a76ad911fada0c821c809bc: Status 404 returned error can't find the container with id e888087c88bd7701753415ba453c3ddad5ef7f2f9a76ad911fada0c821c809bc Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.590004 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mb42m"] Feb 17 16:44:42 crc kubenswrapper[4694]: W0217 16:44:42.605011 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7441c9b_9c03_4267_a0da_376c7d4bcf66.slice/crio-eb74afa3a99ded0840b97a8b12cd3dd3449b122dfc8adf04a28b0fb40796fd71 WatchSource:0}: Error finding container eb74afa3a99ded0840b97a8b12cd3dd3449b122dfc8adf04a28b0fb40796fd71: Status 404 returned error can't find the container with id eb74afa3a99ded0840b97a8b12cd3dd3449b122dfc8adf04a28b0fb40796fd71 Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.675151 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6d7zh"] Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.902645 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.988359 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:42 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:42 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:42 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:42 crc kubenswrapper[4694]: I0217 16:44:42.989045 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.025459 4694 generic.go:334] "Generic (PLEG): container finished" podID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerID="2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e" exitCode=0 Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.025558 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp2g7" event={"ID":"c7a9bea3-8150-4246-9c2b-dd9d57e17f30","Type":"ContainerDied","Data":"2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.025587 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp2g7" event={"ID":"c7a9bea3-8150-4246-9c2b-dd9d57e17f30","Type":"ContainerStarted","Data":"7e7a624116248b88f06db40151e24eb96e33f825e578b426938f5b7077733c56"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.027130 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3f2bde792359f61c493ddf0760cb37ac102af8467a374e64068829546ee7e06a"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.027267 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.027726 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.029703 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"69ccb9d2af9993222de170eddeda7592244883b934dab83c370687a442a6103f"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.030715 4694 generic.go:334] "Generic (PLEG): container finished" podID="33ad90cd-78de-4743-b788-a02aca87e94a" containerID="6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f" exitCode=0 Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.030793 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerDied","Data":"6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.030822 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerStarted","Data":"e888087c88bd7701753415ba453c3ddad5ef7f2f9a76ad911fada0c821c809bc"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.034271 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gn9k9" event={"ID":"348e6db7-381a-4772-abbf-812a3b883c17","Type":"ContainerStarted","Data":"4a7d76310a73567977fceae62dad3e3c39cadca097e6254a18cf5a00b634224a"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.037362 4694 generic.go:334] "Generic (PLEG): container finished" podID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" 
containerID="551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26" exitCode=0 Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.037439 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb42m" event={"ID":"d7441c9b-9c03-4267-a0da-376c7d4bcf66","Type":"ContainerDied","Data":"551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.037465 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb42m" event={"ID":"d7441c9b-9c03-4267-a0da-376c7d4bcf66","Type":"ContainerStarted","Data":"eb74afa3a99ded0840b97a8b12cd3dd3449b122dfc8adf04a28b0fb40796fd71"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.039299 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" event={"ID":"a3388f49-0f84-40f2-8030-a6f508979e71","Type":"ContainerStarted","Data":"220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.039362 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" event={"ID":"a3388f49-0f84-40f2-8030-a6f508979e71","Type":"ContainerStarted","Data":"f2e9d6ec322f83cee67142578ace1145088e9adcb69d57566c632f7ea2fe38af"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.039434 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.042152 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"066d670559decef486eb55c1e3daa88382445b4ed23179023a6abe8ad7722110"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 
16:44:43.046363 4694 generic.go:334] "Generic (PLEG): container finished" podID="04ac4a19-2aa4-44da-ac5d-4df6622094b2" containerID="8bed84bb390689d0473185d854ba1a7944b04c6ef7c1dd3da724bcceeba88939" exitCode=0 Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.046451 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" event={"ID":"04ac4a19-2aa4-44da-ac5d-4df6622094b2","Type":"ContainerDied","Data":"8bed84bb390689d0473185d854ba1a7944b04c6ef7c1dd3da724bcceeba88939"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.049227 4694 generic.go:334] "Generic (PLEG): container finished" podID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerID="2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c" exitCode=0 Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.049271 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmjbf" event={"ID":"0f68e586-955c-4c2c-8b3e-a91f6b95a442","Type":"ContainerDied","Data":"2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.049295 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmjbf" event={"ID":"0f68e586-955c-4c2c-8b3e-a91f6b95a442","Type":"ContainerStarted","Data":"0edb71418bfca3ec158f3eb0b03d5c1e63917a21058205ba15df3f70e441bcfd"} Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.180037 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" podStartSLOduration=131.180016314 podStartE2EDuration="2m11.180016314s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:43.177560732 +0000 UTC m=+150.934636056" watchObservedRunningTime="2026-02-17 
16:44:43.180016314 +0000 UTC m=+150.937091638" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.283515 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tx6l2"] Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.284525 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.286451 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.296344 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx6l2"] Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.448265 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-utilities\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.448320 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-catalog-content\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.448451 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g7vx\" (UniqueName: \"kubernetes.io/projected/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-kube-api-access-9g7vx\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " 
pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.549163 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-utilities\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.549246 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-catalog-content\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.549322 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g7vx\" (UniqueName: \"kubernetes.io/projected/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-kube-api-access-9g7vx\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.549981 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-utilities\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.550194 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-catalog-content\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" 
Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.568535 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g7vx\" (UniqueName: \"kubernetes.io/projected/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-kube-api-access-9g7vx\") pod \"redhat-marketplace-tx6l2\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.601233 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.689558 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mhc82"] Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.690791 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.705051 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhc82"] Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.852886 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwqv\" (UniqueName: \"kubernetes.io/projected/f570cc19-8dbd-49a9-a576-e86967c85dc4-kube-api-access-snwqv\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.852964 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-utilities\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 
16:44:43.852994 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-catalog-content\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.871438 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx6l2"] Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.954177 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-utilities\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.954556 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-catalog-content\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.954642 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwqv\" (UniqueName: \"kubernetes.io/projected/f570cc19-8dbd-49a9-a576-e86967c85dc4-kube-api-access-snwqv\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.955470 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-catalog-content\") pod 
\"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.955542 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-utilities\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.987517 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwqv\" (UniqueName: \"kubernetes.io/projected/f570cc19-8dbd-49a9-a576-e86967c85dc4-kube-api-access-snwqv\") pod \"redhat-marketplace-mhc82\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.993778 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:43 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:43 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:43 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:43 crc kubenswrapper[4694]: I0217 16:44:43.993835 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.009807 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.019388 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mtmz6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.091698 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx6l2" event={"ID":"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4","Type":"ContainerStarted","Data":"ff6b22c4977dc85ec19235e04dd66f15a6411b4bd67624e678865f431eab57d9"} Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.285114 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xsqk6"] Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.286657 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.292549 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.348576 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xsqk6"] Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.462349 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-utilities\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.462623 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-catalog-content\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.462724 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtzbw\" (UniqueName: \"kubernetes.io/projected/f2002375-3db0-44d4-8c8d-e945a20a38d9-kube-api-access-mtzbw\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.481333 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.481395 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.481841 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.481893 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 
10.217.0.19:8080: connect: connection refused" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.489353 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.489418 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.491320 4694 patch_prober.go:28] interesting pod/console-f9d7485db-896vh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.491365 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-896vh" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.523061 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.539269 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.539332 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.552906 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.564843 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-utilities\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.564955 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-catalog-content\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.564982 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtzbw\" (UniqueName: \"kubernetes.io/projected/f2002375-3db0-44d4-8c8d-e945a20a38d9-kube-api-access-mtzbw\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.567895 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-catalog-content\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.567908 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-utilities\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.588332 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtzbw\" (UniqueName: \"kubernetes.io/projected/f2002375-3db0-44d4-8c8d-e945a20a38d9-kube-api-access-mtzbw\") pod \"redhat-operators-xsqk6\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.617910 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.617964 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.626915 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.656825 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhc82"] Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.666041 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04ac4a19-2aa4-44da-ac5d-4df6622094b2-secret-volume\") pod \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.666129 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rtn\" (UniqueName: \"kubernetes.io/projected/04ac4a19-2aa4-44da-ac5d-4df6622094b2-kube-api-access-22rtn\") pod \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.666172 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04ac4a19-2aa4-44da-ac5d-4df6622094b2-config-volume\") pod \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\" (UID: \"04ac4a19-2aa4-44da-ac5d-4df6622094b2\") " Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.677846 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ac4a19-2aa4-44da-ac5d-4df6622094b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "04ac4a19-2aa4-44da-ac5d-4df6622094b2" (UID: "04ac4a19-2aa4-44da-ac5d-4df6622094b2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.678408 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ac4a19-2aa4-44da-ac5d-4df6622094b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "04ac4a19-2aa4-44da-ac5d-4df6622094b2" (UID: "04ac4a19-2aa4-44da-ac5d-4df6622094b2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.694080 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ac4a19-2aa4-44da-ac5d-4df6622094b2-kube-api-access-22rtn" (OuterVolumeSpecName: "kube-api-access-22rtn") pod "04ac4a19-2aa4-44da-ac5d-4df6622094b2" (UID: "04ac4a19-2aa4-44da-ac5d-4df6622094b2"). InnerVolumeSpecName "kube-api-access-22rtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.696114 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8bncf"] Feb 17 16:44:44 crc kubenswrapper[4694]: E0217 16:44:44.696754 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ac4a19-2aa4-44da-ac5d-4df6622094b2" containerName="collect-profiles" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.696844 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ac4a19-2aa4-44da-ac5d-4df6622094b2" containerName="collect-profiles" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.697157 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ac4a19-2aa4-44da-ac5d-4df6622094b2" containerName="collect-profiles" Feb 17 16:44:44 crc kubenswrapper[4694]: W0217 16:44:44.697164 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf570cc19_8dbd_49a9_a576_e86967c85dc4.slice/crio-bc3ea9f24853ef4d7d9dc00cf13db84f529ae7cbc21906017f338c3b53cd9b94 WatchSource:0}: Error finding container bc3ea9f24853ef4d7d9dc00cf13db84f529ae7cbc21906017f338c3b53cd9b94: Status 404 returned error can't find the container with id bc3ea9f24853ef4d7d9dc00cf13db84f529ae7cbc21906017f338c3b53cd9b94 Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.698914 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.701852 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bncf"] Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.767756 4694 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04ac4a19-2aa4-44da-ac5d-4df6622094b2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.767805 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22rtn\" (UniqueName: \"kubernetes.io/projected/04ac4a19-2aa4-44da-ac5d-4df6622094b2-kube-api-access-22rtn\") on node \"crc\" DevicePath \"\"" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.767814 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04ac4a19-2aa4-44da-ac5d-4df6622094b2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.869416 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4j9p\" (UniqueName: \"kubernetes.io/projected/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-kube-api-access-w4j9p\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " 
pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.869473 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-catalog-content\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.869580 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-utilities\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.875395 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.875447 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.892212 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.946165 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xsqk6"] Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.971248 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4j9p\" (UniqueName: \"kubernetes.io/projected/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-kube-api-access-w4j9p\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " 
pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.971305 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-catalog-content\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.971425 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-utilities\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.971915 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-catalog-content\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.971965 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-utilities\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.985073 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.987930 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4j9p\" (UniqueName: 
\"kubernetes.io/projected/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-kube-api-access-w4j9p\") pod \"redhat-operators-8bncf\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") " pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.988657 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:44 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:44 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:44 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:44 crc kubenswrapper[4694]: I0217 16:44:44.988701 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:45 crc kubenswrapper[4694]: W0217 16:44:45.032968 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2002375_3db0_44d4_8c8d_e945a20a38d9.slice/crio-53a9252944df2a649d7fa594975084f97637978bbedaf6352738b6eb540b8505 WatchSource:0}: Error finding container 53a9252944df2a649d7fa594975084f97637978bbedaf6352738b6eb540b8505: Status 404 returned error can't find the container with id 53a9252944df2a649d7fa594975084f97637978bbedaf6352738b6eb540b8505 Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.036451 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bncf" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.094737 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsqk6" event={"ID":"f2002375-3db0-44d4-8c8d-e945a20a38d9","Type":"ContainerStarted","Data":"53a9252944df2a649d7fa594975084f97637978bbedaf6352738b6eb540b8505"} Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.098413 4694 generic.go:334] "Generic (PLEG): container finished" podID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerID="e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5" exitCode=0 Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.098501 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhc82" event={"ID":"f570cc19-8dbd-49a9-a576-e86967c85dc4","Type":"ContainerDied","Data":"e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5"} Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.098638 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhc82" event={"ID":"f570cc19-8dbd-49a9-a576-e86967c85dc4","Type":"ContainerStarted","Data":"bc3ea9f24853ef4d7d9dc00cf13db84f529ae7cbc21906017f338c3b53cd9b94"} Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.114897 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" event={"ID":"04ac4a19-2aa4-44da-ac5d-4df6622094b2","Type":"ContainerDied","Data":"ab77d87d46992a632a5fae8f51b9d869f8e1278bbd12efbe9a1dd9d064c125f3"} Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.114927 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.114941 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab77d87d46992a632a5fae8f51b9d869f8e1278bbd12efbe9a1dd9d064c125f3" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.117284 4694 generic.go:334] "Generic (PLEG): container finished" podID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerID="a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b" exitCode=0 Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.117374 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx6l2" event={"ID":"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4","Type":"ContainerDied","Data":"a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b"} Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.121137 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wqqd4" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.126958 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-99c97" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.251998 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.439381 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bncf"] Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.448135 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.448962 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.450917 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.451199 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.461465 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 16:44:45 crc kubenswrapper[4694]: W0217 16:44:45.486112 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb86392_9eb2_4486_9480_cbf6ad26a5c5.slice/crio-34ce3816f1d15906b05c28e3ce2969291340c4faccee7b273f2bac4f12a27d7d WatchSource:0}: Error finding container 34ce3816f1d15906b05c28e3ce2969291340c4faccee7b273f2bac4f12a27d7d: Status 404 returned error can't find the container with id 34ce3816f1d15906b05c28e3ce2969291340c4faccee7b273f2bac4f12a27d7d Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.551160 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.552285 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.554269 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.557219 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.560131 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.593363 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efdb711c-8f69-4479-bde3-a98695ad60a5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.593450 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efdb711c-8f69-4479-bde3-a98695ad60a5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.694257 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.694367 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efdb711c-8f69-4479-bde3-a98695ad60a5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.694402 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.694451 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efdb711c-8f69-4479-bde3-a98695ad60a5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.694520 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efdb711c-8f69-4479-bde3-a98695ad60a5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.731105 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efdb711c-8f69-4479-bde3-a98695ad60a5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.772005 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.795859 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.795970 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.796051 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.818777 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.870285 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.995439 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.997180 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:45 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:45 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:45 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:45 crc kubenswrapper[4694]: I0217 16:44:45.997231 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:46 crc kubenswrapper[4694]: W0217 16:44:46.015638 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podefdb711c_8f69_4479_bde3_a98695ad60a5.slice/crio-92dc78d142508e9fe751b2183eca5882bf7a81f89c9975720863fc7d27c17ba8 WatchSource:0}: Error finding container 92dc78d142508e9fe751b2183eca5882bf7a81f89c9975720863fc7d27c17ba8: Status 404 returned error can't find the container with id 92dc78d142508e9fe751b2183eca5882bf7a81f89c9975720863fc7d27c17ba8 Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.143177 4694 generic.go:334] "Generic (PLEG): container finished" podID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerID="b41f89d658c4823eeada89a3c733c52a0a0348b598ec2305b0b2384c4d26a33f" exitCode=0 Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.143246 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerDied","Data":"b41f89d658c4823eeada89a3c733c52a0a0348b598ec2305b0b2384c4d26a33f"} Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.143272 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerStarted","Data":"34ce3816f1d15906b05c28e3ce2969291340c4faccee7b273f2bac4f12a27d7d"} Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.144746 4694 generic.go:334] "Generic (PLEG): container finished" podID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerID="db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64" exitCode=0 Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.144803 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsqk6" event={"ID":"f2002375-3db0-44d4-8c8d-e945a20a38d9","Type":"ContainerDied","Data":"db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64"} Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.147297 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"efdb711c-8f69-4479-bde3-a98695ad60a5","Type":"ContainerStarted","Data":"92dc78d142508e9fe751b2183eca5882bf7a81f89c9975720863fc7d27c17ba8"} Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.400691 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.988368 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:46 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 
16:44:46 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:46 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:46 crc kubenswrapper[4694]: I0217 16:44:46.988509 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.159110 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2af7839c-9e94-4093-8b19-6c590dd9e3cf","Type":"ContainerStarted","Data":"8d3c69937d775d6086e857422bc4d5c84cdc298d4225db74c146be53fcbb3665"} Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.159465 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2af7839c-9e94-4093-8b19-6c590dd9e3cf","Type":"ContainerStarted","Data":"9acc02ca37ccbd661bfd9cc5e209d8baf0112a0deb3c28132795af142dbe8f6d"} Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.163824 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"efdb711c-8f69-4479-bde3-a98695ad60a5","Type":"ContainerStarted","Data":"56a9d530d20d41d9b3a8a7e7bc57f66b6fb6847e693f956dcf8c5deb9aa4cabf"} Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.178490 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.178475111 podStartE2EDuration="2.178475111s" podCreationTimestamp="2026-02-17 16:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:47.172647844 +0000 UTC m=+154.929723168" watchObservedRunningTime="2026-02-17 16:44:47.178475111 +0000 UTC 
m=+154.935550435" Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.191170 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.191153632 podStartE2EDuration="2.191153632s" podCreationTimestamp="2026-02-17 16:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:44:47.188262989 +0000 UTC m=+154.945338333" watchObservedRunningTime="2026-02-17 16:44:47.191153632 +0000 UTC m=+154.948228956" Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.987712 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:47 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:47 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:47 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:47 crc kubenswrapper[4694]: I0217 16:44:47.987762 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:48 crc kubenswrapper[4694]: I0217 16:44:48.175288 4694 generic.go:334] "Generic (PLEG): container finished" podID="efdb711c-8f69-4479-bde3-a98695ad60a5" containerID="56a9d530d20d41d9b3a8a7e7bc57f66b6fb6847e693f956dcf8c5deb9aa4cabf" exitCode=0 Feb 17 16:44:48 crc kubenswrapper[4694]: I0217 16:44:48.175350 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"efdb711c-8f69-4479-bde3-a98695ad60a5","Type":"ContainerDied","Data":"56a9d530d20d41d9b3a8a7e7bc57f66b6fb6847e693f956dcf8c5deb9aa4cabf"} 
Feb 17 16:44:48 crc kubenswrapper[4694]: I0217 16:44:48.178955 4694 generic.go:334] "Generic (PLEG): container finished" podID="2af7839c-9e94-4093-8b19-6c590dd9e3cf" containerID="8d3c69937d775d6086e857422bc4d5c84cdc298d4225db74c146be53fcbb3665" exitCode=0 Feb 17 16:44:48 crc kubenswrapper[4694]: I0217 16:44:48.178981 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2af7839c-9e94-4093-8b19-6c590dd9e3cf","Type":"ContainerDied","Data":"8d3c69937d775d6086e857422bc4d5c84cdc298d4225db74c146be53fcbb3665"} Feb 17 16:44:48 crc kubenswrapper[4694]: I0217 16:44:48.987807 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:48 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:48 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:48 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:48 crc kubenswrapper[4694]: I0217 16:44:48.987887 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:49 crc kubenswrapper[4694]: I0217 16:44:49.988730 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:49 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:49 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:49 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:49 crc kubenswrapper[4694]: I0217 16:44:49.989058 4694 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:50 crc kubenswrapper[4694]: I0217 16:44:50.639153 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nq4xh" Feb 17 16:44:50 crc kubenswrapper[4694]: I0217 16:44:50.988775 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:50 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:50 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:50 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:50 crc kubenswrapper[4694]: I0217 16:44:50.989196 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:51 crc kubenswrapper[4694]: I0217 16:44:51.988089 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:51 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:51 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:51 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:51 crc kubenswrapper[4694]: I0217 16:44:51.988165 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:52 crc kubenswrapper[4694]: I0217 16:44:52.987151 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:52 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:52 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:52 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:52 crc kubenswrapper[4694]: I0217 16:44:52.987219 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:53.988720 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:54 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:54 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:54 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.022117 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.487255 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.487584 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.487716 4694 patch_prober.go:28] interesting pod/downloads-7954f5f757-nsrtk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.487740 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nsrtk" podUID="f2bdc667-8b8d-48de-8f28-f3bf6ef1f3e2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.490296 4694 patch_prober.go:28] interesting pod/console-f9d7485db-896vh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.490332 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-896vh" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.988208 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:54 crc kubenswrapper[4694]: [-]has-synced failed: reason withheld Feb 17 16:44:54 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:54 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:54 crc kubenswrapper[4694]: I0217 16:44:54.988261 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:55 crc kubenswrapper[4694]: I0217 16:44:55.167837 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:55 crc kubenswrapper[4694]: I0217 16:44:55.174053 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/974057b2-a009-4d99-8bad-e50b651c8c3c-metrics-certs\") pod \"network-metrics-daemon-4qb4m\" (UID: \"974057b2-a009-4d99-8bad-e50b651c8c3c\") " pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:55 crc kubenswrapper[4694]: I0217 16:44:55.408705 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qb4m" Feb 17 16:44:55 crc kubenswrapper[4694]: I0217 16:44:55.988256 4694 patch_prober.go:28] interesting pod/router-default-5444994796-t7nr4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 16:44:55 crc kubenswrapper[4694]: [+]has-synced ok Feb 17 16:44:55 crc kubenswrapper[4694]: [+]process-running ok Feb 17 16:44:55 crc kubenswrapper[4694]: healthz check failed Feb 17 16:44:55 crc kubenswrapper[4694]: I0217 16:44:55.988537 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t7nr4" podUID="c4aa91f8-086e-415b-aadc-da13d3d90ae9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 16:44:56 crc kubenswrapper[4694]: I0217 16:44:56.987426 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:56 crc kubenswrapper[4694]: I0217 16:44:56.990304 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-t7nr4" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.232758 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.238328 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.257811 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"efdb711c-8f69-4479-bde3-a98695ad60a5","Type":"ContainerDied","Data":"92dc78d142508e9fe751b2183eca5882bf7a81f89c9975720863fc7d27c17ba8"} Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.257891 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92dc78d142508e9fe751b2183eca5882bf7a81f89c9975720863fc7d27c17ba8" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.257995 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.262465 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2af7839c-9e94-4093-8b19-6c590dd9e3cf","Type":"ContainerDied","Data":"9acc02ca37ccbd661bfd9cc5e209d8baf0112a0deb3c28132795af142dbe8f6d"} Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.262760 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.262909 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9acc02ca37ccbd661bfd9cc5e209d8baf0112a0deb3c28132795af142dbe8f6d" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408316 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kubelet-dir\") pod \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408406 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2af7839c-9e94-4093-8b19-6c590dd9e3cf" (UID: "2af7839c-9e94-4093-8b19-6c590dd9e3cf"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408433 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kube-api-access\") pod \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\" (UID: \"2af7839c-9e94-4093-8b19-6c590dd9e3cf\") " Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408466 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efdb711c-8f69-4479-bde3-a98695ad60a5-kubelet-dir\") pod \"efdb711c-8f69-4479-bde3-a98695ad60a5\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408525 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efdb711c-8f69-4479-bde3-a98695ad60a5-kube-api-access\") pod \"efdb711c-8f69-4479-bde3-a98695ad60a5\" (UID: \"efdb711c-8f69-4479-bde3-a98695ad60a5\") " Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408596 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efdb711c-8f69-4479-bde3-a98695ad60a5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "efdb711c-8f69-4479-bde3-a98695ad60a5" (UID: "efdb711c-8f69-4479-bde3-a98695ad60a5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408716 4694 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.408732 4694 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/efdb711c-8f69-4479-bde3-a98695ad60a5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.416096 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdb711c-8f69-4479-bde3-a98695ad60a5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "efdb711c-8f69-4479-bde3-a98695ad60a5" (UID: "efdb711c-8f69-4479-bde3-a98695ad60a5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.423696 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2af7839c-9e94-4093-8b19-6c590dd9e3cf" (UID: "2af7839c-9e94-4093-8b19-6c590dd9e3cf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.509660 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2af7839c-9e94-4093-8b19-6c590dd9e3cf-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 16:44:58 crc kubenswrapper[4694]: I0217 16:44:58.509687 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efdb711c-8f69-4479-bde3-a98695ad60a5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.132570 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"] Feb 17 16:45:00 crc kubenswrapper[4694]: E0217 16:45:00.132841 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efdb711c-8f69-4479-bde3-a98695ad60a5" containerName="pruner" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.132858 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="efdb711c-8f69-4479-bde3-a98695ad60a5" containerName="pruner" Feb 17 16:45:00 crc kubenswrapper[4694]: E0217 16:45:00.132877 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af7839c-9e94-4093-8b19-6c590dd9e3cf" containerName="pruner" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.132885 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af7839c-9e94-4093-8b19-6c590dd9e3cf" containerName="pruner" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.133013 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af7839c-9e94-4093-8b19-6c590dd9e3cf" containerName="pruner" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.133029 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="efdb711c-8f69-4479-bde3-a98695ad60a5" containerName="pruner" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.133523 
4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.135347 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.135849 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.138830 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"] Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.234983 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-secret-volume\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.235068 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntlg8\" (UniqueName: \"kubernetes.io/projected/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-kube-api-access-ntlg8\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.235141 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-config-volume\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") 
" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.336301 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntlg8\" (UniqueName: \"kubernetes.io/projected/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-kube-api-access-ntlg8\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.336345 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-config-volume\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.336378 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-secret-volume\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.352939 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-secret-volume\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.356716 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-config-volume\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.359434 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntlg8\" (UniqueName: \"kubernetes.io/projected/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-kube-api-access-ntlg8\") pod \"collect-profiles-29522445-plgcb\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:00 crc kubenswrapper[4694]: I0217 16:45:00.454986 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" Feb 17 16:45:02 crc kubenswrapper[4694]: I0217 16:45:02.355455 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:45:04 crc kubenswrapper[4694]: I0217 16:45:04.485443 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nsrtk" Feb 17 16:45:04 crc kubenswrapper[4694]: I0217 16:45:04.493897 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:45:04 crc kubenswrapper[4694]: I0217 16:45:04.502370 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:45:14 crc kubenswrapper[4694]: I0217 16:45:14.617643 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Feb 17 16:45:14 crc kubenswrapper[4694]: I0217 16:45:14.618145 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:45:14 crc kubenswrapper[4694]: E0217 16:45:14.946269 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 16:45:14 crc kubenswrapper[4694]: E0217 16:45:14.946425 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b72ks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rmjbf_openshift-marketplace(0f68e586-955c-4c2c-8b3e-a91f6b95a442): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 16:45:14 crc kubenswrapper[4694]: E0217 16:45:14.947675 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rmjbf" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" Feb 17 16:45:15 crc 
kubenswrapper[4694]: E0217 16:45:15.001757 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 17 16:45:15 crc kubenswrapper[4694]: E0217 16:45:15.001921 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbtt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-mb42m_openshift-marketplace(d7441c9b-9c03-4267-a0da-376c7d4bcf66): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 16:45:15 crc kubenswrapper[4694]: E0217 16:45:15.003106 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mb42m" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" Feb 17 16:45:15 crc kubenswrapper[4694]: I0217 16:45:15.211348 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-27mc4" Feb 17 16:45:16 crc kubenswrapper[4694]: E0217 16:45:16.731915 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 16:45:16 crc kubenswrapper[4694]: E0217 16:45:16.732275 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snwqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mhc82_openshift-marketplace(f570cc19-8dbd-49a9-a576-e86967c85dc4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 16:45:16 crc kubenswrapper[4694]: E0217 16:45:16.733509 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mhc82" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" Feb 17 16:45:17 crc 
kubenswrapper[4694]: E0217 16:45:17.923127 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mhc82" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" Feb 17 16:45:17 crc kubenswrapper[4694]: E0217 16:45:17.923232 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mb42m" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" Feb 17 16:45:17 crc kubenswrapper[4694]: E0217 16:45:17.924647 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rmjbf" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" Feb 17 16:45:18 crc kubenswrapper[4694]: E0217 16:45:18.172685 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 16:45:18 crc kubenswrapper[4694]: E0217 16:45:18.172843 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsrhd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gp2g7_openshift-marketplace(c7a9bea3-8150-4246-9c2b-dd9d57e17f30): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 16:45:18 crc kubenswrapper[4694]: E0217 16:45:18.174202 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gp2g7" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" Feb 17 16:45:20 crc 
kubenswrapper[4694]: I0217 16:45:20.946743 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 16:45:21 crc kubenswrapper[4694]: E0217 16:45:21.656654 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gp2g7" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" Feb 17 16:45:21 crc kubenswrapper[4694]: E0217 16:45:21.907130 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 17 16:45:21 crc kubenswrapper[4694]: E0217 16:45:21.907699 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mtzbw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xsqk6_openshift-marketplace(f2002375-3db0-44d4-8c8d-e945a20a38d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 16:45:21 crc kubenswrapper[4694]: E0217 16:45:21.909082 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xsqk6" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" Feb 17 16:45:22 crc 
kubenswrapper[4694]: E0217 16:45:22.414538 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xsqk6" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.456350 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.459108 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.461740 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.461959 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.465757 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.561421 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99303287-b64c-49d1-a2a5-40ffaa3432b4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.561478 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99303287-b64c-49d1-a2a5-40ffaa3432b4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"99303287-b64c-49d1-a2a5-40ffaa3432b4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.662590 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99303287-b64c-49d1-a2a5-40ffaa3432b4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.662873 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99303287-b64c-49d1-a2a5-40ffaa3432b4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.664602 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99303287-b64c-49d1-a2a5-40ffaa3432b4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.685892 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99303287-b64c-49d1-a2a5-40ffaa3432b4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: E0217 16:45:22.722159 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 17 16:45:22 crc kubenswrapper[4694]: E0217 16:45:22.722308 4694 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tjdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kdxh6_openshift-marketplace(33ad90cd-78de-4743-b788-a02aca87e94a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 16:45:22 crc kubenswrapper[4694]: E0217 16:45:22.723708 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kdxh6" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.791797 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 16:45:22 crc kubenswrapper[4694]: I0217 16:45:22.816837 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4qb4m"] Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.027349 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"] Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.285906 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 16:45:23 crc kubenswrapper[4694]: W0217 16:45:23.293528 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod99303287_b64c_49d1_a2a5_40ffaa3432b4.slice/crio-13b1214aaffba011609721a513311fc38a6314452708ff79f14f4b0fced2e2f1 WatchSource:0}: Error finding container 13b1214aaffba011609721a513311fc38a6314452708ff79f14f4b0fced2e2f1: Status 404 returned error can't find the container with id 13b1214aaffba011609721a513311fc38a6314452708ff79f14f4b0fced2e2f1 Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.387807 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerStarted","Data":"de277e08e183faa00f4c29541265728a0456fda29e48492b46666d29ca17826e"} Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.389485 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" 
event={"ID":"974057b2-a009-4d99-8bad-e50b651c8c3c","Type":"ContainerStarted","Data":"65e0a88346a916b549cce88d2c555234dd285d14e4b87e95c7b5da6d65a2e4a8"}
Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.389530 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" event={"ID":"974057b2-a009-4d99-8bad-e50b651c8c3c","Type":"ContainerStarted","Data":"47724113608aaf286ab681521fe46d79731c008bbf10d41ec93dcb9c8dc44c82"}
Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.390517 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99303287-b64c-49d1-a2a5-40ffaa3432b4","Type":"ContainerStarted","Data":"13b1214aaffba011609721a513311fc38a6314452708ff79f14f4b0fced2e2f1"}
Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.391579 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" event={"ID":"50b786d2-03c3-4db5-b5b2-f8f79e30efd4","Type":"ContainerStarted","Data":"f934634bcbf6bba16f0d59ee82f75e360f2f7669864dfdac04fdb2e3f0ff9546"}
Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.391636 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" event={"ID":"50b786d2-03c3-4db5-b5b2-f8f79e30efd4","Type":"ContainerStarted","Data":"ed0b266d030823c49152a9eaec0df47e2b331bcd2ce19feb16a88f4c3ad0882d"}
Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.393714 4694 generic.go:334] "Generic (PLEG): container finished" podID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerID="122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c" exitCode=0
Feb 17 16:45:23 crc kubenswrapper[4694]: I0217 16:45:23.393804 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx6l2" event={"ID":"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4","Type":"ContainerDied","Data":"122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c"}
Feb 17 16:45:23 crc kubenswrapper[4694]: E0217 16:45:23.395679 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kdxh6" podUID="33ad90cd-78de-4743-b788-a02aca87e94a"
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.400175 4694 generic.go:334] "Generic (PLEG): container finished" podID="50b786d2-03c3-4db5-b5b2-f8f79e30efd4" containerID="f934634bcbf6bba16f0d59ee82f75e360f2f7669864dfdac04fdb2e3f0ff9546" exitCode=0
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.400559 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" event={"ID":"50b786d2-03c3-4db5-b5b2-f8f79e30efd4","Type":"ContainerDied","Data":"f934634bcbf6bba16f0d59ee82f75e360f2f7669864dfdac04fdb2e3f0ff9546"}
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.405223 4694 generic.go:334] "Generic (PLEG): container finished" podID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerID="de277e08e183faa00f4c29541265728a0456fda29e48492b46666d29ca17826e" exitCode=0
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.405348 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerDied","Data":"de277e08e183faa00f4c29541265728a0456fda29e48492b46666d29ca17826e"}
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.410571 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4qb4m" event={"ID":"974057b2-a009-4d99-8bad-e50b651c8c3c","Type":"ContainerStarted","Data":"49612f0e778e46bb6ad2846c5c4371487fcb59ac7c90aec27e6ad1cacce24008"}
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.417249 4694 generic.go:334] "Generic (PLEG): container finished" podID="99303287-b64c-49d1-a2a5-40ffaa3432b4" containerID="ef6baec158a1f681242c6c63eeafea0d3a0ca7aa261168bfb664ef525f87a994" exitCode=0
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.417326 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99303287-b64c-49d1-a2a5-40ffaa3432b4","Type":"ContainerDied","Data":"ef6baec158a1f681242c6c63eeafea0d3a0ca7aa261168bfb664ef525f87a994"}
Feb 17 16:45:24 crc kubenswrapper[4694]: E0217 16:45:24.418785 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod99303287_b64c_49d1_a2a5_40ffaa3432b4.slice/crio-conmon-ef6baec158a1f681242c6c63eeafea0d3a0ca7aa261168bfb664ef525f87a994.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod99303287_b64c_49d1_a2a5_40ffaa3432b4.slice/crio-ef6baec158a1f681242c6c63eeafea0d3a0ca7aa261168bfb664ef525f87a994.scope\": RecentStats: unable to find data in memory cache]"
Feb 17 16:45:24 crc kubenswrapper[4694]: I0217 16:45:24.434339 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4qb4m" podStartSLOduration=172.434319002 podStartE2EDuration="2m52.434319002s" podCreationTimestamp="2026-02-17 16:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:45:24.433014749 +0000 UTC m=+192.190090073" watchObservedRunningTime="2026-02-17 16:45:24.434319002 +0000 UTC m=+192.191394326"
Feb 17 16:45:25 crc kubenswrapper[4694]: I0217 16:45:25.826576 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 16:45:25 crc kubenswrapper[4694]: I0217 16:45:25.857859 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.010061 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-secret-volume\") pod \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") "
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.010124 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99303287-b64c-49d1-a2a5-40ffaa3432b4-kube-api-access\") pod \"99303287-b64c-49d1-a2a5-40ffaa3432b4\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") "
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.010147 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntlg8\" (UniqueName: \"kubernetes.io/projected/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-kube-api-access-ntlg8\") pod \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") "
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.010188 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99303287-b64c-49d1-a2a5-40ffaa3432b4-kubelet-dir\") pod \"99303287-b64c-49d1-a2a5-40ffaa3432b4\" (UID: \"99303287-b64c-49d1-a2a5-40ffaa3432b4\") "
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.010245 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-config-volume\") pod \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\" (UID: \"50b786d2-03c3-4db5-b5b2-f8f79e30efd4\") "
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.011662 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-config-volume" (OuterVolumeSpecName: "config-volume") pod "50b786d2-03c3-4db5-b5b2-f8f79e30efd4" (UID: "50b786d2-03c3-4db5-b5b2-f8f79e30efd4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.012366 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99303287-b64c-49d1-a2a5-40ffaa3432b4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "99303287-b64c-49d1-a2a5-40ffaa3432b4" (UID: "99303287-b64c-49d1-a2a5-40ffaa3432b4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.017583 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "50b786d2-03c3-4db5-b5b2-f8f79e30efd4" (UID: "50b786d2-03c3-4db5-b5b2-f8f79e30efd4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.017741 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99303287-b64c-49d1-a2a5-40ffaa3432b4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "99303287-b64c-49d1-a2a5-40ffaa3432b4" (UID: "99303287-b64c-49d1-a2a5-40ffaa3432b4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.026949 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-kube-api-access-ntlg8" (OuterVolumeSpecName: "kube-api-access-ntlg8") pod "50b786d2-03c3-4db5-b5b2-f8f79e30efd4" (UID: "50b786d2-03c3-4db5-b5b2-f8f79e30efd4"). InnerVolumeSpecName "kube-api-access-ntlg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.112357 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.112420 4694 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.112433 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99303287-b64c-49d1-a2a5-40ffaa3432b4-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.112445 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntlg8\" (UniqueName: \"kubernetes.io/projected/50b786d2-03c3-4db5-b5b2-f8f79e30efd4-kube-api-access-ntlg8\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.112461 4694 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99303287-b64c-49d1-a2a5-40ffaa3432b4-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.437124 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx6l2" event={"ID":"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4","Type":"ContainerStarted","Data":"f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9"}
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.441241 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99303287-b64c-49d1-a2a5-40ffaa3432b4","Type":"ContainerDied","Data":"13b1214aaffba011609721a513311fc38a6314452708ff79f14f4b0fced2e2f1"}
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.441282 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b1214aaffba011609721a513311fc38a6314452708ff79f14f4b0fced2e2f1"
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.441311 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.442528 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb" event={"ID":"50b786d2-03c3-4db5-b5b2-f8f79e30efd4","Type":"ContainerDied","Data":"ed0b266d030823c49152a9eaec0df47e2b331bcd2ce19feb16a88f4c3ad0882d"}
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.442571 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed0b266d030823c49152a9eaec0df47e2b331bcd2ce19feb16a88f4c3ad0882d"
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.442594 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"
Feb 17 16:45:26 crc kubenswrapper[4694]: I0217 16:45:26.462264 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tx6l2" podStartSLOduration=3.286372982 podStartE2EDuration="43.462247057s" podCreationTimestamp="2026-02-17 16:44:43 +0000 UTC" firstStartedPulling="2026-02-17 16:44:45.142674657 +0000 UTC m=+152.899749981" lastFinishedPulling="2026-02-17 16:45:25.318548732 +0000 UTC m=+193.075624056" observedRunningTime="2026-02-17 16:45:26.457663661 +0000 UTC m=+194.214738985" watchObservedRunningTime="2026-02-17 16:45:26.462247057 +0000 UTC m=+194.219322381"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.445013 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 16:45:27 crc kubenswrapper[4694]: E0217 16:45:27.445533 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99303287-b64c-49d1-a2a5-40ffaa3432b4" containerName="pruner"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.445547 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="99303287-b64c-49d1-a2a5-40ffaa3432b4" containerName="pruner"
Feb 17 16:45:27 crc kubenswrapper[4694]: E0217 16:45:27.445566 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b786d2-03c3-4db5-b5b2-f8f79e30efd4" containerName="collect-profiles"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.445575 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b786d2-03c3-4db5-b5b2-f8f79e30efd4" containerName="collect-profiles"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.445711 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b786d2-03c3-4db5-b5b2-f8f79e30efd4" containerName="collect-profiles"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.445730 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="99303287-b64c-49d1-a2a5-40ffaa3432b4" containerName="pruner"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.446144 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.452098 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.452356 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.456567 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.473312 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerStarted","Data":"12cf9888a019a584b87520003f07e948bfdd33571bc4262b0c5ceef1e6e790ed"}
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.632734 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69753e1-8094-4124-8ce3-7978d53239f6-kube-api-access\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.632818 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-var-lock\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.632855 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.733905 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69753e1-8094-4124-8ce3-7978d53239f6-kube-api-access\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.733967 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-var-lock\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.734010 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.734095 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.734474 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-var-lock\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.760443 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69753e1-8094-4124-8ce3-7978d53239f6-kube-api-access\") pod \"installer-9-crc\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:27 crc kubenswrapper[4694]: I0217 16:45:27.777978 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 16:45:28 crc kubenswrapper[4694]: I0217 16:45:28.205028 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8bncf" podStartSLOduration=3.899186474 podStartE2EDuration="44.205006906s" podCreationTimestamp="2026-02-17 16:44:44 +0000 UTC" firstStartedPulling="2026-02-17 16:44:46.145653482 +0000 UTC m=+153.902728806" lastFinishedPulling="2026-02-17 16:45:26.451473914 +0000 UTC m=+194.208549238" observedRunningTime="2026-02-17 16:45:27.500434521 +0000 UTC m=+195.257509845" watchObservedRunningTime="2026-02-17 16:45:28.205006906 +0000 UTC m=+195.962082220"
Feb 17 16:45:28 crc kubenswrapper[4694]: I0217 16:45:28.205778 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 16:45:28 crc kubenswrapper[4694]: W0217 16:45:28.221953 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf69753e1_8094_4124_8ce3_7978d53239f6.slice/crio-623b5a48495018c7dbfd9719cb9acc035af8db2967e9c6ea200e697a9a428339 WatchSource:0}: Error finding container 623b5a48495018c7dbfd9719cb9acc035af8db2967e9c6ea200e697a9a428339: Status 404 returned error can't find the container with id 623b5a48495018c7dbfd9719cb9acc035af8db2967e9c6ea200e697a9a428339
Feb 17 16:45:28 crc kubenswrapper[4694]: I0217 16:45:28.479453 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f69753e1-8094-4124-8ce3-7978d53239f6","Type":"ContainerStarted","Data":"623b5a48495018c7dbfd9719cb9acc035af8db2967e9c6ea200e697a9a428339"}
Feb 17 16:45:29 crc kubenswrapper[4694]: I0217 16:45:29.485679 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f69753e1-8094-4124-8ce3-7978d53239f6","Type":"ContainerStarted","Data":"ea1d908109f4f5937f509c78d64e2c8786bc1e29b50ba61e9d35077084853cac"}
Feb 17 16:45:30 crc kubenswrapper[4694]: I0217 16:45:30.507153 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.507133017 podStartE2EDuration="3.507133017s" podCreationTimestamp="2026-02-17 16:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:45:30.503528576 +0000 UTC m=+198.260603900" watchObservedRunningTime="2026-02-17 16:45:30.507133017 +0000 UTC m=+198.264208341"
Feb 17 16:45:32 crc kubenswrapper[4694]: I0217 16:45:32.504512 4694 generic.go:334] "Generic (PLEG): container finished" podID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerID="15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501" exitCode=0
Feb 17 16:45:32 crc kubenswrapper[4694]: I0217 16:45:32.504620 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhc82" event={"ID":"f570cc19-8dbd-49a9-a576-e86967c85dc4","Type":"ContainerDied","Data":"15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501"}
Feb 17 16:45:33 crc kubenswrapper[4694]: I0217 16:45:33.513312 4694 generic.go:334] "Generic (PLEG): container finished" podID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerID="50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a" exitCode=0
Feb 17 16:45:33 crc kubenswrapper[4694]: I0217 16:45:33.513383 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb42m" event={"ID":"d7441c9b-9c03-4267-a0da-376c7d4bcf66","Type":"ContainerDied","Data":"50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a"}
Feb 17 16:45:33 crc kubenswrapper[4694]: I0217 16:45:33.518950 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhc82" event={"ID":"f570cc19-8dbd-49a9-a576-e86967c85dc4","Type":"ContainerStarted","Data":"9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32"}
Feb 17 16:45:33 crc kubenswrapper[4694]: I0217 16:45:33.562988 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mhc82" podStartSLOduration=2.591991365 podStartE2EDuration="50.562966937s" podCreationTimestamp="2026-02-17 16:44:43 +0000 UTC" firstStartedPulling="2026-02-17 16:44:45.100190503 +0000 UTC m=+152.857265827" lastFinishedPulling="2026-02-17 16:45:33.071166075 +0000 UTC m=+200.828241399" observedRunningTime="2026-02-17 16:45:33.561651554 +0000 UTC m=+201.318726878" watchObservedRunningTime="2026-02-17 16:45:33.562966937 +0000 UTC m=+201.320042261"
Feb 17 16:45:33 crc kubenswrapper[4694]: I0217 16:45:33.601674 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tx6l2"
Feb 17 16:45:33 crc kubenswrapper[4694]: I0217 16:45:33.601724 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tx6l2"
Feb 17 16:45:34 crc kubenswrapper[4694]: I0217 16:45:34.010998 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mhc82"
Feb 17 16:45:34 crc kubenswrapper[4694]: I0217 16:45:34.011050 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mhc82"
Feb 17 16:45:34 crc kubenswrapper[4694]: I0217 16:45:34.303268 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tx6l2"
Feb 17 16:45:34 crc kubenswrapper[4694]: I0217 16:45:34.575144 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tx6l2"
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.037818 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8bncf"
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.038180 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8bncf"
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.080773 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8bncf"
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.280553 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mhc82" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="registry-server" probeResult="failure" output=<
Feb 17 16:45:35 crc kubenswrapper[4694]: timeout: failed to connect service ":50051" within 1s
Feb 17 16:45:35 crc kubenswrapper[4694]: >
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.540171 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb42m" event={"ID":"d7441c9b-9c03-4267-a0da-376c7d4bcf66","Type":"ContainerStarted","Data":"5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411"}
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.541832 4694 generic.go:334] "Generic (PLEG): container finished" podID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerID="d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78" exitCode=0
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.541895 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsqk6" event={"ID":"f2002375-3db0-44d4-8c8d-e945a20a38d9","Type":"ContainerDied","Data":"d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78"}
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.544544 4694 generic.go:334] "Generic (PLEG): container finished" podID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerID="c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5" exitCode=0
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.544769 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmjbf" event={"ID":"0f68e586-955c-4c2c-8b3e-a91f6b95a442","Type":"ContainerDied","Data":"c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5"}
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.566316 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mb42m" podStartSLOduration=2.595873745 podStartE2EDuration="54.566297888s" podCreationTimestamp="2026-02-17 16:44:41 +0000 UTC" firstStartedPulling="2026-02-17 16:44:43.038406232 +0000 UTC m=+150.795481556" lastFinishedPulling="2026-02-17 16:45:35.008830375 +0000 UTC m=+202.765905699" observedRunningTime="2026-02-17 16:45:35.561077036 +0000 UTC m=+203.318152360" watchObservedRunningTime="2026-02-17 16:45:35.566297888 +0000 UTC m=+203.323373212"
Feb 17 16:45:35 crc kubenswrapper[4694]: I0217 16:45:35.586364 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8bncf"
Feb 17 16:45:36 crc kubenswrapper[4694]: I0217 16:45:36.551860 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerStarted","Data":"22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54"}
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.571268 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmjbf" event={"ID":"0f68e586-955c-4c2c-8b3e-a91f6b95a442","Type":"ContainerStarted","Data":"244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5"}
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.573812 4694 generic.go:334] "Generic (PLEG): container finished" podID="33ad90cd-78de-4743-b788-a02aca87e94a" containerID="22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54" exitCode=0
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.573868 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerDied","Data":"22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54"}
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.578503 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsqk6" event={"ID":"f2002375-3db0-44d4-8c8d-e945a20a38d9","Type":"ContainerStarted","Data":"2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa"}
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.582976 4694 generic.go:334] "Generic (PLEG): container finished" podID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerID="fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913" exitCode=0
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.583010 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp2g7" event={"ID":"c7a9bea3-8150-4246-9c2b-dd9d57e17f30","Type":"ContainerDied","Data":"fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913"}
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.589449 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rmjbf" podStartSLOduration=2.44468759 podStartE2EDuration="56.589435501s" podCreationTimestamp="2026-02-17 16:44:41 +0000 UTC" firstStartedPulling="2026-02-17 16:44:43.050339954 +0000 UTC m=+150.807415278" lastFinishedPulling="2026-02-17 16:45:37.195087865 +0000 UTC m=+204.952163189" observedRunningTime="2026-02-17 16:45:37.586961488 +0000 UTC m=+205.344036812" watchObservedRunningTime="2026-02-17 16:45:37.589435501 +0000 UTC m=+205.346510825"
Feb 17 16:45:37 crc kubenswrapper[4694]: I0217 16:45:37.646112 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xsqk6" podStartSLOduration=2.564910578 podStartE2EDuration="53.646092264s" podCreationTimestamp="2026-02-17 16:44:44 +0000 UTC" firstStartedPulling="2026-02-17 16:44:46.146041702 +0000 UTC m=+153.903117026" lastFinishedPulling="2026-02-17 16:45:37.227223388 +0000 UTC m=+204.984298712" observedRunningTime="2026-02-17 16:45:37.642738309 +0000 UTC m=+205.399813623" watchObservedRunningTime="2026-02-17 16:45:37.646092264 +0000 UTC m=+205.403167588"
Feb 17 16:45:38 crc kubenswrapper[4694]: I0217 16:45:38.531726 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bncf"]
Feb 17 16:45:38 crc kubenswrapper[4694]: I0217 16:45:38.532441 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8bncf" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="registry-server" containerID="cri-o://12cf9888a019a584b87520003f07e948bfdd33571bc4262b0c5ceef1e6e790ed" gracePeriod=2
Feb 17 16:45:38 crc kubenswrapper[4694]: I0217 16:45:38.590088 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerStarted","Data":"c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea"}
Feb 17 16:45:38 crc kubenswrapper[4694]: I0217 16:45:38.592848 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp2g7" event={"ID":"c7a9bea3-8150-4246-9c2b-dd9d57e17f30","Type":"ContainerStarted","Data":"4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4"}
Feb 17 16:45:38 crc kubenswrapper[4694]: I0217 16:45:38.609050 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdxh6" podStartSLOduration=2.46285238 podStartE2EDuration="57.609033776s" podCreationTimestamp="2026-02-17 16:44:41 +0000 UTC" firstStartedPulling="2026-02-17 16:44:43.032554354 +0000 UTC m=+150.789629678" lastFinishedPulling="2026-02-17 16:45:38.17873575 +0000 UTC m=+205.935811074" observedRunningTime="2026-02-17 16:45:38.606898282 +0000 UTC m=+206.363973626" watchObservedRunningTime="2026-02-17 16:45:38.609033776 +0000 UTC m=+206.366109100"
Feb 17 16:45:38 crc kubenswrapper[4694]: I0217 16:45:38.632051 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gp2g7" podStartSLOduration=2.698805009 podStartE2EDuration="57.632034488s" podCreationTimestamp="2026-02-17 16:44:41 +0000 UTC" firstStartedPulling="2026-02-17 16:44:43.027126266 +0000 UTC m=+150.784201590" lastFinishedPulling="2026-02-17 16:45:37.960355745 +0000 UTC m=+205.717431069" observedRunningTime="2026-02-17 16:45:38.631601417 +0000 UTC m=+206.388676751" watchObservedRunningTime="2026-02-17 16:45:38.632034488 +0000 UTC m=+206.389109812"
Feb 17 16:45:39 crc kubenswrapper[4694]: I0217 16:45:39.599700 4694 generic.go:334] "Generic (PLEG): container finished" podID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerID="12cf9888a019a584b87520003f07e948bfdd33571bc4262b0c5ceef1e6e790ed" exitCode=0
Feb 17 16:45:39 crc kubenswrapper[4694]: I0217 16:45:39.599746 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerDied","Data":"12cf9888a019a584b87520003f07e948bfdd33571bc4262b0c5ceef1e6e790ed"}
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.126512 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bncf"
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.194600 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-utilities\") pod \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") "
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.194666 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4j9p\" (UniqueName: \"kubernetes.io/projected/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-kube-api-access-w4j9p\") pod \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") "
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.194785 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-catalog-content\") pod \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\" (UID: \"cbb86392-9eb2-4486-9480-cbf6ad26a5c5\") "
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.195363 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-utilities" (OuterVolumeSpecName: "utilities") pod "cbb86392-9eb2-4486-9480-cbf6ad26a5c5" (UID: "cbb86392-9eb2-4486-9480-cbf6ad26a5c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.200789 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-kube-api-access-w4j9p" (OuterVolumeSpecName: "kube-api-access-w4j9p") pod "cbb86392-9eb2-4486-9480-cbf6ad26a5c5" (UID: "cbb86392-9eb2-4486-9480-cbf6ad26a5c5"). InnerVolumeSpecName "kube-api-access-w4j9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.296312 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.303120 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4j9p\" (UniqueName: \"kubernetes.io/projected/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-kube-api-access-w4j9p\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.342686 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbb86392-9eb2-4486-9480-cbf6ad26a5c5" (UID: "cbb86392-9eb2-4486-9480-cbf6ad26a5c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.404355 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb86392-9eb2-4486-9480-cbf6ad26a5c5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.606341 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bncf" event={"ID":"cbb86392-9eb2-4486-9480-cbf6ad26a5c5","Type":"ContainerDied","Data":"34ce3816f1d15906b05c28e3ce2969291340c4faccee7b273f2bac4f12a27d7d"}
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.606398 4694 scope.go:117] "RemoveContainer" containerID="12cf9888a019a584b87520003f07e948bfdd33571bc4262b0c5ceef1e6e790ed"
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.606541 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8bncf"
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.628885 4694 scope.go:117] "RemoveContainer" containerID="de277e08e183faa00f4c29541265728a0456fda29e48492b46666d29ca17826e"
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.641376 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bncf"]
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.646170 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8bncf"]
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.674893 4694 scope.go:117] "RemoveContainer" containerID="b41f89d658c4823eeada89a3c733c52a0a0348b598ec2305b0b2384c4d26a33f"
Feb 17 16:45:40 crc kubenswrapper[4694]: I0217 16:45:40.903665 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" path="/var/lib/kubelet/pods/cbb86392-9eb2-4486-9480-cbf6ad26a5c5/volumes"
Feb 17 16:45:41 crc
kubenswrapper[4694]: I0217 16:45:41.673027 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.673748 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.720041 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.724003 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.724048 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.766406 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.862469 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.862773 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:45:41 crc kubenswrapper[4694]: I0217 16:45:41.907233 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:45:42 crc kubenswrapper[4694]: I0217 16:45:42.128531 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:45:42 crc kubenswrapper[4694]: I0217 16:45:42.128939 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:45:42 crc kubenswrapper[4694]: I0217 16:45:42.170547 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:45:42 crc kubenswrapper[4694]: I0217 16:45:42.654016 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:45:42 crc kubenswrapper[4694]: I0217 16:45:42.659891 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:45:43 crc kubenswrapper[4694]: I0217 16:45:43.654830 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:45:43 crc kubenswrapper[4694]: I0217 16:45:43.928303 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mb42m"] Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.051718 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.125821 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.617478 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.617836 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.617881 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.618422 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.618518 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6" gracePeriod=600 Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.627382 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.627417 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.627925 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mb42m" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerName="registry-server" containerID="cri-o://5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411" gracePeriod=2 Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 
16:45:44.664438 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:45:44 crc kubenswrapper[4694]: E0217 16:45:44.748530 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e7e385_beb4_4e06_8718_fd68e90ba74e.slice/crio-8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7441c9b_9c03_4267_a0da_376c7d4bcf66.slice/crio-5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411.scope\": RecentStats: unable to find data in memory cache]" Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.929154 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdxh6"] Feb 17 16:45:44 crc kubenswrapper[4694]: I0217 16:45:44.991764 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.165975 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-catalog-content\") pod \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.166018 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbtt8\" (UniqueName: \"kubernetes.io/projected/d7441c9b-9c03-4267-a0da-376c7d4bcf66-kube-api-access-mbtt8\") pod \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.166139 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-utilities\") pod \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\" (UID: \"d7441c9b-9c03-4267-a0da-376c7d4bcf66\") " Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.167273 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-utilities" (OuterVolumeSpecName: "utilities") pod "d7441c9b-9c03-4267-a0da-376c7d4bcf66" (UID: "d7441c9b-9c03-4267-a0da-376c7d4bcf66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.174879 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7441c9b-9c03-4267-a0da-376c7d4bcf66-kube-api-access-mbtt8" (OuterVolumeSpecName: "kube-api-access-mbtt8") pod "d7441c9b-9c03-4267-a0da-376c7d4bcf66" (UID: "d7441c9b-9c03-4267-a0da-376c7d4bcf66"). InnerVolumeSpecName "kube-api-access-mbtt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.220867 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7441c9b-9c03-4267-a0da-376c7d4bcf66" (UID: "d7441c9b-9c03-4267-a0da-376c7d4bcf66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.269623 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.269657 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7441c9b-9c03-4267-a0da-376c7d4bcf66-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.269668 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbtt8\" (UniqueName: \"kubernetes.io/projected/d7441c9b-9c03-4267-a0da-376c7d4bcf66-kube-api-access-mbtt8\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.633189 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6" exitCode=0 Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.633262 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6"} Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.633288 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"963ecf435fb681d4097c1e2e11de629281374ce880fdb6edbb191e877f7901e8"} Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.635250 4694 generic.go:334] "Generic (PLEG): container finished" podID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerID="5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411" exitCode=0 Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.635382 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mb42m" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.635529 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kdxh6" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="registry-server" containerID="cri-o://c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea" gracePeriod=2 Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.635384 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb42m" event={"ID":"d7441c9b-9c03-4267-a0da-376c7d4bcf66","Type":"ContainerDied","Data":"5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411"} Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.637298 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb42m" event={"ID":"d7441c9b-9c03-4267-a0da-376c7d4bcf66","Type":"ContainerDied","Data":"eb74afa3a99ded0840b97a8b12cd3dd3449b122dfc8adf04a28b0fb40796fd71"} Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.637414 4694 scope.go:117] "RemoveContainer" containerID="5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.660620 4694 scope.go:117] "RemoveContainer" 
containerID="50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.667767 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mb42m"] Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.670741 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mb42m"] Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.677558 4694 scope.go:117] "RemoveContainer" containerID="551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.688593 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.692297 4694 scope.go:117] "RemoveContainer" containerID="5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411" Feb 17 16:45:45 crc kubenswrapper[4694]: E0217 16:45:45.692851 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411\": container with ID starting with 5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411 not found: ID does not exist" containerID="5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.692962 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411"} err="failed to get container status \"5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411\": rpc error: code = NotFound desc = could not find container \"5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411\": container with ID starting with 5fd81e40cdf4063b649a3d62d6f7316bec314e4e58b13a6851ad0ed40e1ba411 
not found: ID does not exist" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.693061 4694 scope.go:117] "RemoveContainer" containerID="50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a" Feb 17 16:45:45 crc kubenswrapper[4694]: E0217 16:45:45.693455 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a\": container with ID starting with 50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a not found: ID does not exist" containerID="50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.693548 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a"} err="failed to get container status \"50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a\": rpc error: code = NotFound desc = could not find container \"50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a\": container with ID starting with 50c6ad3cbab79a69e799c05a63bd22b33aa875b2edd36ee437660823e461570a not found: ID does not exist" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.693631 4694 scope.go:117] "RemoveContainer" containerID="551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26" Feb 17 16:45:45 crc kubenswrapper[4694]: E0217 16:45:45.693917 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26\": container with ID starting with 551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26 not found: ID does not exist" containerID="551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.694012 4694 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26"} err="failed to get container status \"551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26\": rpc error: code = NotFound desc = could not find container \"551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26\": container with ID starting with 551264c6b7ebbe7071efa93b59471405af6bf82560fb6e823dd5bd343e4edd26 not found: ID does not exist" Feb 17 16:45:45 crc kubenswrapper[4694]: I0217 16:45:45.983829 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.078205 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tjdc\" (UniqueName: \"kubernetes.io/projected/33ad90cd-78de-4743-b788-a02aca87e94a-kube-api-access-4tjdc\") pod \"33ad90cd-78de-4743-b788-a02aca87e94a\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.078316 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-utilities\") pod \"33ad90cd-78de-4743-b788-a02aca87e94a\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.078390 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-catalog-content\") pod \"33ad90cd-78de-4743-b788-a02aca87e94a\" (UID: \"33ad90cd-78de-4743-b788-a02aca87e94a\") " Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.079184 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-utilities" (OuterVolumeSpecName: "utilities") pod "33ad90cd-78de-4743-b788-a02aca87e94a" (UID: "33ad90cd-78de-4743-b788-a02aca87e94a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.081916 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ad90cd-78de-4743-b788-a02aca87e94a-kube-api-access-4tjdc" (OuterVolumeSpecName: "kube-api-access-4tjdc") pod "33ad90cd-78de-4743-b788-a02aca87e94a" (UID: "33ad90cd-78de-4743-b788-a02aca87e94a"). InnerVolumeSpecName "kube-api-access-4tjdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.098290 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tjdc\" (UniqueName: \"kubernetes.io/projected/33ad90cd-78de-4743-b788-a02aca87e94a-kube-api-access-4tjdc\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.098326 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.136024 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33ad90cd-78de-4743-b788-a02aca87e94a" (UID: "33ad90cd-78de-4743-b788-a02aca87e94a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.200033 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ad90cd-78de-4743-b788-a02aca87e94a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.641985 4694 generic.go:334] "Generic (PLEG): container finished" podID="33ad90cd-78de-4743-b788-a02aca87e94a" containerID="c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea" exitCode=0 Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.642045 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdxh6" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.642074 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerDied","Data":"c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea"} Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.642455 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdxh6" event={"ID":"33ad90cd-78de-4743-b788-a02aca87e94a","Type":"ContainerDied","Data":"e888087c88bd7701753415ba453c3ddad5ef7f2f9a76ad911fada0c821c809bc"} Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.642486 4694 scope.go:117] "RemoveContainer" containerID="c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.655865 4694 scope.go:117] "RemoveContainer" containerID="22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.668270 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdxh6"] Feb 17 16:45:46 crc kubenswrapper[4694]: 
I0217 16:45:46.670661 4694 scope.go:117] "RemoveContainer" containerID="6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.677363 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdxh6"] Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.714447 4694 scope.go:117] "RemoveContainer" containerID="c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea" Feb 17 16:45:46 crc kubenswrapper[4694]: E0217 16:45:46.714939 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea\": container with ID starting with c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea not found: ID does not exist" containerID="c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.714971 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea"} err="failed to get container status \"c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea\": rpc error: code = NotFound desc = could not find container \"c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea\": container with ID starting with c65ed230426a33458caece7bd497986026228717fd8207a256049554bebc6cea not found: ID does not exist" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.714996 4694 scope.go:117] "RemoveContainer" containerID="22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54" Feb 17 16:45:46 crc kubenswrapper[4694]: E0217 16:45:46.715293 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54\": container 
with ID starting with 22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54 not found: ID does not exist" containerID="22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.715315 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54"} err="failed to get container status \"22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54\": rpc error: code = NotFound desc = could not find container \"22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54\": container with ID starting with 22aa8be7d9cd6575a9bcd9a45c144bbcfdbe24e6d160b84ff731568eadb9fd54 not found: ID does not exist" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.715328 4694 scope.go:117] "RemoveContainer" containerID="6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f" Feb 17 16:45:46 crc kubenswrapper[4694]: E0217 16:45:46.715631 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f\": container with ID starting with 6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f not found: ID does not exist" containerID="6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.715658 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f"} err="failed to get container status \"6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f\": rpc error: code = NotFound desc = could not find container \"6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f\": container with ID starting with 6e0446491322b5404ce78c5e5867f4acce916aeb537050054b637263ba1c3d3f not 
found: ID does not exist" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.902736 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" path="/var/lib/kubelet/pods/33ad90cd-78de-4743-b788-a02aca87e94a/volumes" Feb 17 16:45:46 crc kubenswrapper[4694]: I0217 16:45:46.903429 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" path="/var/lib/kubelet/pods/d7441c9b-9c03-4267-a0da-376c7d4bcf66/volumes" Feb 17 16:45:47 crc kubenswrapper[4694]: I0217 16:45:47.333533 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhc82"] Feb 17 16:45:47 crc kubenswrapper[4694]: I0217 16:45:47.333999 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mhc82" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="registry-server" containerID="cri-o://9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32" gracePeriod=2 Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.214946 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.326021 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-catalog-content\") pod \"f570cc19-8dbd-49a9-a576-e86967c85dc4\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.326082 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snwqv\" (UniqueName: \"kubernetes.io/projected/f570cc19-8dbd-49a9-a576-e86967c85dc4-kube-api-access-snwqv\") pod \"f570cc19-8dbd-49a9-a576-e86967c85dc4\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.326194 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-utilities\") pod \"f570cc19-8dbd-49a9-a576-e86967c85dc4\" (UID: \"f570cc19-8dbd-49a9-a576-e86967c85dc4\") " Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.326929 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-utilities" (OuterVolumeSpecName: "utilities") pod "f570cc19-8dbd-49a9-a576-e86967c85dc4" (UID: "f570cc19-8dbd-49a9-a576-e86967c85dc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.330851 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f570cc19-8dbd-49a9-a576-e86967c85dc4-kube-api-access-snwqv" (OuterVolumeSpecName: "kube-api-access-snwqv") pod "f570cc19-8dbd-49a9-a576-e86967c85dc4" (UID: "f570cc19-8dbd-49a9-a576-e86967c85dc4"). InnerVolumeSpecName "kube-api-access-snwqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.355481 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f570cc19-8dbd-49a9-a576-e86967c85dc4" (UID: "f570cc19-8dbd-49a9-a576-e86967c85dc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.428057 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.428097 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snwqv\" (UniqueName: \"kubernetes.io/projected/f570cc19-8dbd-49a9-a576-e86967c85dc4-kube-api-access-snwqv\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.428108 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f570cc19-8dbd-49a9-a576-e86967c85dc4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.656522 4694 generic.go:334] "Generic (PLEG): container finished" podID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerID="9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32" exitCode=0 Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.656558 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhc82" event={"ID":"f570cc19-8dbd-49a9-a576-e86967c85dc4","Type":"ContainerDied","Data":"9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32"} Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.656581 4694 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mhc82" event={"ID":"f570cc19-8dbd-49a9-a576-e86967c85dc4","Type":"ContainerDied","Data":"bc3ea9f24853ef4d7d9dc00cf13db84f529ae7cbc21906017f338c3b53cd9b94"} Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.656599 4694 scope.go:117] "RemoveContainer" containerID="9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.656594 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhc82" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.673569 4694 scope.go:117] "RemoveContainer" containerID="15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.688766 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhc82"] Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.688831 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhc82"] Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.705939 4694 scope.go:117] "RemoveContainer" containerID="e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.720052 4694 scope.go:117] "RemoveContainer" containerID="9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32" Feb 17 16:45:48 crc kubenswrapper[4694]: E0217 16:45:48.720497 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32\": container with ID starting with 9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32 not found: ID does not exist" containerID="9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.720679 4694 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32"} err="failed to get container status \"9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32\": rpc error: code = NotFound desc = could not find container \"9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32\": container with ID starting with 9d7fe8f628053a4b5b30a443e63eedd8d22788f68c013eef753e1674c8940a32 not found: ID does not exist" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.720821 4694 scope.go:117] "RemoveContainer" containerID="15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501" Feb 17 16:45:48 crc kubenswrapper[4694]: E0217 16:45:48.721969 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501\": container with ID starting with 15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501 not found: ID does not exist" containerID="15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.722013 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501"} err="failed to get container status \"15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501\": rpc error: code = NotFound desc = could not find container \"15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501\": container with ID starting with 15a57bea7bd368b26e2eadef63f490aab4231c2756435c4a575dc820fea02501 not found: ID does not exist" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.722043 4694 scope.go:117] "RemoveContainer" containerID="e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5" Feb 17 16:45:48 crc kubenswrapper[4694]: E0217 
16:45:48.722366 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5\": container with ID starting with e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5 not found: ID does not exist" containerID="e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.722509 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5"} err="failed to get container status \"e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5\": rpc error: code = NotFound desc = could not find container \"e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5\": container with ID starting with e4b7bfb250eceefb63bddec7f7ea2b90e1a642e8182f5f87bec89a24709040a5 not found: ID does not exist" Feb 17 16:45:48 crc kubenswrapper[4694]: I0217 16:45:48.902316 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" path="/var/lib/kubelet/pods/f570cc19-8dbd-49a9-a576-e86967c85dc4/volumes" Feb 17 16:45:51 crc kubenswrapper[4694]: I0217 16:45:51.762806 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:45:53 crc kubenswrapper[4694]: I0217 16:45:53.917427 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fmljb"] Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.702363 4694 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.704147 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" 
containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.704242 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.704314 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.704371 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.704432 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.704563 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.704644 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.704713 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.704779 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.704837 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.704903 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" 
containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.704976 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.705037 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705094 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.705156 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705213 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.705272 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705328 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="extract-content" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.705387 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705444 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.705537 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" 
containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705598 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.705675 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705735 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="extract-utilities" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705886 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7441c9b-9c03-4267-a0da-376c7d4bcf66" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.705951 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb86392-9eb2-4486-9480-cbf6ad26a5c5" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706016 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f570cc19-8dbd-49a9-a576-e86967c85dc4" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706086 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ad90cd-78de-4743-b788-a02aca87e94a" containerName="registry-server" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706454 4694 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706541 4694 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706585 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.706789 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706854 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.706917 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.706974 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.707037 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707097 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.707159 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707217 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.707295 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 16:46:06 crc kubenswrapper[4694]: 
I0217 16:46:06.707355 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.707432 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707516 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707156 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125" gracePeriod=15 Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707124 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462" gracePeriod=15 Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707106 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd" gracePeriod=15 Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707073 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7" gracePeriod=15 Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.707580 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707900 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.707058 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141" gracePeriod=15 Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.708111 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.708131 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.708141 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.708152 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.708161 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 16:46:06 crc 
kubenswrapper[4694]: I0217 16:46:06.708171 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.708181 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 16:46:06 crc kubenswrapper[4694]: E0217 16:46:06.708592 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.712285 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.721146 4694 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.755592 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815209 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815256 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815273 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815286 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815449 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815490 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815546 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.815660 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.917381 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.918168 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.918347 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.919223 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.919690 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.919893 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.920153 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.920365 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.917496 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.919733 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.918423 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.919970 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.918217 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.920209 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.919315 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:06 crc kubenswrapper[4694]: I0217 16:46:06.920396 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.051939 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:07 crc kubenswrapper[4694]: E0217 16:46:07.085753 4694 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895167fea23c457 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 16:46:07.085028439 +0000 UTC 
m=+234.842103803,LastTimestamp:2026-02-17 16:46:07.085028439 +0000 UTC m=+234.842103803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.757418 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af"} Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.757500 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7fefd21ded2195b6da77e1ce2db13c902f967aa941ddbab4fbc06a32bb2fb2b4"} Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.758519 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.761886 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.764516 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.765861 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462" exitCode=0 Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.765897 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7" exitCode=0 Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.765913 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd" exitCode=0 Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.765927 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125" exitCode=2 Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.766040 4694 scope.go:117] "RemoveContainer" containerID="63a09dde7abcb58604b236b384b89f988d5360b91c39e471765bd884bc50cf22" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.771197 4694 generic.go:334] "Generic (PLEG): container finished" podID="f69753e1-8094-4124-8ce3-7978d53239f6" containerID="ea1d908109f4f5937f509c78d64e2c8786bc1e29b50ba61e9d35077084853cac" exitCode=0 Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.771296 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f69753e1-8094-4124-8ce3-7978d53239f6","Type":"ContainerDied","Data":"ea1d908109f4f5937f509c78d64e2c8786bc1e29b50ba61e9d35077084853cac"} Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.772543 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: 
connection refused" Feb 17 16:46:07 crc kubenswrapper[4694]: I0217 16:46:07.773065 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:07 crc kubenswrapper[4694]: E0217 16:46:07.940425 4694 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" volumeName="registry-storage" Feb 17 16:46:08 crc kubenswrapper[4694]: I0217 16:46:08.847782 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.076988 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.077586 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.078085 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.078246 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.078425 4694 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.134708 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.135234 4694 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.135472 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.135783 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.250987 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251074 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-kubelet-dir\") pod \"f69753e1-8094-4124-8ce3-7978d53239f6\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251143 
4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-var-lock\") pod \"f69753e1-8094-4124-8ce3-7978d53239f6\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251164 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251186 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251229 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251235 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f69753e1-8094-4124-8ce3-7978d53239f6" (UID: "f69753e1-8094-4124-8ce3-7978d53239f6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251283 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-var-lock" (OuterVolumeSpecName: "var-lock") pod "f69753e1-8094-4124-8ce3-7978d53239f6" (UID: "f69753e1-8094-4124-8ce3-7978d53239f6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251312 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251365 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69753e1-8094-4124-8ce3-7978d53239f6-kube-api-access\") pod \"f69753e1-8094-4124-8ce3-7978d53239f6\" (UID: \"f69753e1-8094-4124-8ce3-7978d53239f6\") " Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251360 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251851 4694 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251866 4694 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251876 4694 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251884 4694 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f69753e1-8094-4124-8ce3-7978d53239f6-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.251892 4694 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.259376 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69753e1-8094-4124-8ce3-7978d53239f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f69753e1-8094-4124-8ce3-7978d53239f6" (UID: "f69753e1-8094-4124-8ce3-7978d53239f6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.353436 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f69753e1-8094-4124-8ce3-7978d53239f6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.860010 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f69753e1-8094-4124-8ce3-7978d53239f6","Type":"ContainerDied","Data":"623b5a48495018c7dbfd9719cb9acc035af8db2967e9c6ea200e697a9a428339"} Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.860053 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623b5a48495018c7dbfd9719cb9acc035af8db2967e9c6ea200e697a9a428339" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.860048 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.862724 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.863666 4694 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141" exitCode=0 Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.863706 4694 scope.go:117] "RemoveContainer" containerID="1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.863869 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.881706 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.881965 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.882209 4694 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.882639 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.882895 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.883413 4694 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.887825 4694 scope.go:117] "RemoveContainer" containerID="b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.901261 4694 scope.go:117] "RemoveContainer" containerID="1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.919556 4694 scope.go:117] "RemoveContainer" containerID="99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.936647 4694 scope.go:117] "RemoveContainer" containerID="1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.952196 4694 scope.go:117] "RemoveContainer" containerID="83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.976068 4694 scope.go:117] "RemoveContainer" containerID="1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462" Feb 17 16:46:09 crc kubenswrapper[4694]: E0217 16:46:09.976492 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\": container with ID starting with 1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462 not found: ID does not 
exist" containerID="1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.976547 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462"} err="failed to get container status \"1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\": rpc error: code = NotFound desc = could not find container \"1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462\": container with ID starting with 1d1176d19ead79f438e265e72a7d76ed934970775f7bfd6b43f48757eedfb462 not found: ID does not exist" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.976584 4694 scope.go:117] "RemoveContainer" containerID="b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7" Feb 17 16:46:09 crc kubenswrapper[4694]: E0217 16:46:09.976994 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\": container with ID starting with b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7 not found: ID does not exist" containerID="b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.977081 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7"} err="failed to get container status \"b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\": rpc error: code = NotFound desc = could not find container \"b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7\": container with ID starting with b96f33c3182a03dc939a9cdb13daf42a8d9dda7c415b46aafc0d86018be387f7 not found: ID does not exist" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.977111 4694 scope.go:117] 
"RemoveContainer" containerID="1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd" Feb 17 16:46:09 crc kubenswrapper[4694]: E0217 16:46:09.978234 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\": container with ID starting with 1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd not found: ID does not exist" containerID="1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.978278 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd"} err="failed to get container status \"1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\": rpc error: code = NotFound desc = could not find container \"1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd\": container with ID starting with 1f1b7eaa5fb416cd8ed24fa62203850b30a22aecdc5eff19523fbf54e64d5cdd not found: ID does not exist" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.978302 4694 scope.go:117] "RemoveContainer" containerID="99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125" Feb 17 16:46:09 crc kubenswrapper[4694]: E0217 16:46:09.978704 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\": container with ID starting with 99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125 not found: ID does not exist" containerID="99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.978783 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125"} err="failed to get container status \"99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\": rpc error: code = NotFound desc = could not find container \"99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125\": container with ID starting with 99dcca82dca0ac4f75fd385c4264c93e96c6ef2589f28bbca0eaf044437c7125 not found: ID does not exist" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.978856 4694 scope.go:117] "RemoveContainer" containerID="1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141" Feb 17 16:46:09 crc kubenswrapper[4694]: E0217 16:46:09.979264 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\": container with ID starting with 1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141 not found: ID does not exist" containerID="1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.979291 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141"} err="failed to get container status \"1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\": rpc error: code = NotFound desc = could not find container \"1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141\": container with ID starting with 1233bac59dbde3183c560d109e29822a5d13845c7b5b93235ca77ef6388d3141 not found: ID does not exist" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.979309 4694 scope.go:117] "RemoveContainer" containerID="83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9" Feb 17 16:46:09 crc kubenswrapper[4694]: E0217 16:46:09.979639 4694 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\": container with ID starting with 83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9 not found: ID does not exist" containerID="83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9" Feb 17 16:46:09 crc kubenswrapper[4694]: I0217 16:46:09.979689 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9"} err="failed to get container status \"83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\": rpc error: code = NotFound desc = could not find container \"83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9\": container with ID starting with 83051ebaa49a7d5654d357aed608267c945503a50a2efb1b2f7887d099c958c9 not found: ID does not exist" Feb 17 16:46:10 crc kubenswrapper[4694]: E0217 16:46:10.864739 4694 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895167fea23c457 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 16:46:07.085028439 +0000 UTC m=+234.842103803,LastTimestamp:2026-02-17 16:46:07.085028439 +0000 UTC 
m=+234.842103803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 16:46:10 crc kubenswrapper[4694]: I0217 16:46:10.905950 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.685524 4694 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.685938 4694 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.686652 4694 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.686905 4694 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.687183 4694 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:12 crc kubenswrapper[4694]: I0217 16:46:12.687225 4694 controller.go:115] "failed to update lease using latest 
lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.687668 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Feb 17 16:46:12 crc kubenswrapper[4694]: E0217 16:46:12.888439 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Feb 17 16:46:12 crc kubenswrapper[4694]: I0217 16:46:12.907013 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:12 crc kubenswrapper[4694]: I0217 16:46:12.907423 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:13 crc kubenswrapper[4694]: E0217 16:46:13.289522 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Feb 17 16:46:14 crc kubenswrapper[4694]: E0217 16:46:14.090794 4694 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Feb 17 16:46:15 crc kubenswrapper[4694]: E0217 16:46:15.692373 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Feb 17 16:46:18 crc kubenswrapper[4694]: E0217 16:46:18.893796 4694 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="6.4s" Feb 17 16:46:18 crc kubenswrapper[4694]: I0217 16:46:18.954110 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" containerName="oauth-openshift" containerID="cri-o://11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52" gracePeriod=15 Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.344887 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.346013 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.346658 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.346842 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496669 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-ocp-branding-template\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496760 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-dir\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" 
(UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496798 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-idp-0-file-data\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496839 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-error\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496872 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-trusted-ca-bundle\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496904 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-provider-selection\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496937 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8rfg\" (UniqueName: \"kubernetes.io/projected/552639c4-d873-44a5-bbf1-0ada555d4d92-kube-api-access-n8rfg\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " 
Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.497464 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-cliconfig\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.497527 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-router-certs\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.497556 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-service-ca\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.497596 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-login\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.497654 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-serving-cert\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.498067 4694 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.496856 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.498449 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.498764 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.498896 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-policies\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.498929 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-session\") pod \"552639c4-d873-44a5-bbf1-0ada555d4d92\" (UID: \"552639c4-d873-44a5-bbf1-0ada555d4d92\") " Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.499408 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.499425 4694 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.499446 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.499458 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 
16:46:19.500084 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.503113 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.503638 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.504532 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.504945 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.505022 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.505328 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.505806 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.507398 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552639c4-d873-44a5-bbf1-0ada555d4d92-kube-api-access-n8rfg" (OuterVolumeSpecName: "kube-api-access-n8rfg") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "kube-api-access-n8rfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.508958 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "552639c4-d873-44a5-bbf1-0ada555d4d92" (UID: "552639c4-d873-44a5-bbf1-0ada555d4d92"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600346 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600391 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600407 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600422 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8rfg\" (UniqueName: \"kubernetes.io/projected/552639c4-d873-44a5-bbf1-0ada555d4d92-kube-api-access-n8rfg\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600435 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600445 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600456 4694 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600828 4694 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/552639c4-d873-44a5-bbf1-0ada555d4d92-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600871 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.600889 4694 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/552639c4-d873-44a5-bbf1-0ada555d4d92-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.918215 4694 generic.go:334] "Generic (PLEG): container finished" podID="552639c4-d873-44a5-bbf1-0ada555d4d92" containerID="11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52" exitCode=0 Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.918284 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" event={"ID":"552639c4-d873-44a5-bbf1-0ada555d4d92","Type":"ContainerDied","Data":"11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52"} Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.918559 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" event={"ID":"552639c4-d873-44a5-bbf1-0ada555d4d92","Type":"ContainerDied","Data":"7315024aa3f3f636da74751676e8a8ec75f1b741724931a56e3a50f95916afda"} Feb 17 16:46:19 crc 
kubenswrapper[4694]: I0217 16:46:19.918348 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.918578 4694 scope.go:117] "RemoveContainer" containerID="11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.919300 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.919489 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.919772 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.940566 4694 scope.go:117] "RemoveContainer" containerID="11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.941015 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: E0217 16:46:19.941210 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52\": container with ID starting with 11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52 not found: ID does not exist" containerID="11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.941257 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52"} err="failed to get container status \"11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52\": rpc error: code = NotFound desc = could not find container \"11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52\": container with ID starting with 11cc020364326a3380ca85c8780472a95d5d518f16e03b56abfe53b5f94d9b52 not found: ID does not exist" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.941455 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:19 crc kubenswrapper[4694]: I0217 16:46:19.941725 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.75:6443: connect: connection refused" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.467007 4694 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:50396->192.168.126.11:10257: read: connection reset by peer" start-of-body= Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.467079 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:50396->192.168.126.11:10257: read: connection reset by peer" Feb 17 16:46:20 crc kubenswrapper[4694]: E0217 16:46:20.866590 4694 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895167fea23c457 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 16:46:07.085028439 +0000 UTC m=+234.842103803,LastTimestamp:2026-02-17 16:46:07.085028439 +0000 UTC m=+234.842103803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.926919 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.926971 4694 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31" exitCode=1 Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.927058 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31"} Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.928851 4694 scope.go:117] "RemoveContainer" containerID="0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.928865 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.929304 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.929565 4694 status_manager.go:851] 
"Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:20 crc kubenswrapper[4694]: I0217 16:46:20.929932 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.894726 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.895574 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.896065 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.896389 4694 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.896745 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.907264 4694 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.907301 4694 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:21 crc kubenswrapper[4694]: E0217 16:46:21.907767 4694 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.908197 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:21 crc kubenswrapper[4694]: W0217 16:46:21.924979 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-d66ab0608eebe59e8a4f27a36d134bcfd6c8f332bc54526d04a62ae7bee01f42 WatchSource:0}: Error finding container d66ab0608eebe59e8a4f27a36d134bcfd6c8f332bc54526d04a62ae7bee01f42: Status 404 returned error can't find the container with id d66ab0608eebe59e8a4f27a36d134bcfd6c8f332bc54526d04a62ae7bee01f42 Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.936969 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d66ab0608eebe59e8a4f27a36d134bcfd6c8f332bc54526d04a62ae7bee01f42"} Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.940123 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.940177 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"267c0ecf492ed07987863d1462fc4a8cdc69e0019d02ab555f4797b78cc94704"} Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.941030 4694 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.941427 
4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.941918 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:21 crc kubenswrapper[4694]: I0217 16:46:21.942212 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.902460 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.903574 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 
16:46:22.904232 4694 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.904682 4694 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.905141 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.947853 4694 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d7ff97e4e9d597d8f17ddf5317c8c49cf96bab6f773caea9899410fce2ac7235" exitCode=0 Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.947905 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d7ff97e4e9d597d8f17ddf5317c8c49cf96bab6f773caea9899410fce2ac7235"} Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.948188 4694 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 
16:46:22.948208 4694 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:22 crc kubenswrapper[4694]: E0217 16:46:22.948573 4694 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.948576 4694 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.949395 4694 status_manager.go:851] "Failed to get status for pod" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.949981 4694 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.950356 4694 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:22 crc kubenswrapper[4694]: I0217 16:46:22.950886 4694 status_manager.go:851] "Failed to get status for pod" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" pod="openshift-authentication/oauth-openshift-558db77b4-fmljb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fmljb\": dial tcp 38.102.83.75:6443: connect: connection refused" Feb 17 16:46:23 crc kubenswrapper[4694]: I0217 16:46:23.973461 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5dcab86b4a960ea4732772b931649738b869fa9009b87c407a47a23963d7042a"} Feb 17 16:46:23 crc kubenswrapper[4694]: I0217 16:46:23.973815 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"02565bacf249eb80fd0faec27efb0aab90ba3a9ff1b5970035c879a06b283101"} Feb 17 16:46:23 crc kubenswrapper[4694]: I0217 16:46:23.973826 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a3b90fe1eee3797822b6b96fddc771c8e9d3eb1f1f30f9d76f27e0d71daf02e7"} Feb 17 16:46:23 crc kubenswrapper[4694]: I0217 16:46:23.973834 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4db3c3ecb7924d334795669efbe46a94e759a637edd8f74aafaffabfdfb9237e"} Feb 17 16:46:24 crc kubenswrapper[4694]: I0217 16:46:24.982489 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e06fc21b67bd5d028492b302005ec1fa0019a0e9167eadf5c3fc09b224aca743"} Feb 17 16:46:24 crc kubenswrapper[4694]: I0217 16:46:24.982649 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:24 crc kubenswrapper[4694]: I0217 16:46:24.982732 4694 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:24 crc kubenswrapper[4694]: I0217 16:46:24.982756 4694 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:26 crc kubenswrapper[4694]: I0217 16:46:26.908768 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:26 crc kubenswrapper[4694]: I0217 16:46:26.909075 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:26 crc kubenswrapper[4694]: I0217 16:46:26.913675 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:29 crc kubenswrapper[4694]: I0217 16:46:29.991339 4694 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.012398 4694 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.012430 4694 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 
16:46:30.015390 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.040823 4694 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="be53535b-6489-4e54-8790-0ed212b0b96a" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.463477 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.633752 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.633891 4694 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 16:46:30 crc kubenswrapper[4694]: I0217 16:46:30.634185 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 16:46:31 crc kubenswrapper[4694]: I0217 16:46:31.017592 4694 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:31 crc kubenswrapper[4694]: I0217 16:46:31.017638 4694 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e152d0e3-8cc4-49c4-adeb-fa8710dbcf34" Feb 17 16:46:31 crc kubenswrapper[4694]: I0217 16:46:31.020743 4694 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="be53535b-6489-4e54-8790-0ed212b0b96a" Feb 17 16:46:39 crc kubenswrapper[4694]: I0217 16:46:39.485034 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.488564 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.518998 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.583267 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.634690 4694 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.634739 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.720186 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.802695 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.863973 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.948871 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.972522 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.989040 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 16:46:40 crc kubenswrapper[4694]: I0217 16:46:40.999649 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.105657 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.258098 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.442885 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.443521 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 
16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.451071 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.728154 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.757562 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.811022 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.929387 4694 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.932984 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=35.932965578 podStartE2EDuration="35.932965578s" podCreationTimestamp="2026-02-17 16:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:46:30.007684668 +0000 UTC m=+257.764760012" watchObservedRunningTime="2026-02-17 16:46:41.932965578 +0000 UTC m=+269.690040912" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.934718 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-fmljb"] Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.934780 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.938713 4694 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 16:46:41 crc kubenswrapper[4694]: I0217 16:46:41.965693 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.965664811 podStartE2EDuration="12.965664811s" podCreationTimestamp="2026-02-17 16:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:46:41.953346115 +0000 UTC m=+269.710421449" watchObservedRunningTime="2026-02-17 16:46:41.965664811 +0000 UTC m=+269.722740135" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.151099 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.339116 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.357349 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.359140 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.373496 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.583486 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.584769 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 
16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.636689 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.717238 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 16:46:42 crc kubenswrapper[4694]: I0217 16:46:42.904984 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" path="/var/lib/kubelet/pods/552639c4-d873-44a5-bbf1-0ada555d4d92/volumes" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.044566 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.171272 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.176394 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.269556 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.340275 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.433340 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.536640 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 
16:46:43.616235 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.709345 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.722704 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.814920 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.845125 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 16:46:43 crc kubenswrapper[4694]: I0217 16:46:43.868023 4694 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.031553 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.032180 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.037279 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.056576 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.443932 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" 
Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.456129 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.519364 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.537749 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.547087 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.697945 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.799304 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.832306 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.871099 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.885380 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.893214 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.924530 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" 
Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.985774 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 16:46:44 crc kubenswrapper[4694]: I0217 16:46:44.995773 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.021233 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.024382 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.027433 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.101741 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.118190 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.199340 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.296818 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.362728 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 16:46:45 crc 
kubenswrapper[4694]: I0217 16:46:45.544401 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.567711 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.567980 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.574993 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.578552 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.647682 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.688128 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.708170 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.710574 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.785469 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.792350 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.830978 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.844708 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.926421 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.942989 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 16:46:45 crc kubenswrapper[4694]: I0217 16:46:45.983658 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.056661 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.185924 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.198105 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.227098 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.253013 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.317923 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.329009 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.377992 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.378128 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.419210 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.524747 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.754662 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.792852 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.837950 4694 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.870136 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.927320 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.941486 4694 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 16:46:46 crc kubenswrapper[4694]: I0217 16:46:46.991659 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.064736 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.137812 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.242692 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.305581 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.394463 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.403076 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.406382 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.428744 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.478719 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 16:46:47 crc kubenswrapper[4694]: 
I0217 16:46:47.479704 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.502786 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.539519 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.604022 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.660429 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.672643 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.710030 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.712021 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.714455 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.715274 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.854402 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.964024 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 16:46:47 crc kubenswrapper[4694]: I0217 16:46:47.985962 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.178655 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.189469 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.242245 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.322117 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.448992 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.601403 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.840131 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.854846 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 16:46:48 crc 
kubenswrapper[4694]: I0217 16:46:48.883834 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.906084 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.927664 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.959842 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 16:46:48 crc kubenswrapper[4694]: I0217 16:46:48.983590 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.092053 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.111726 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.227042 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.362773 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.460249 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.464919 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.465248 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.594466 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.637423 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.650727 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.713321 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.717965 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-d787499bd-wkmvm"] Feb 17 16:46:49 crc kubenswrapper[4694]: E0217 16:46:49.718703 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" containerName="oauth-openshift" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.718933 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" containerName="oauth-openshift" Feb 17 16:46:49 crc kubenswrapper[4694]: E0217 16:46:49.719147 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" containerName="installer" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.719325 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" 
containerName="installer" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.719774 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="552639c4-d873-44a5-bbf1-0ada555d4d92" containerName="oauth-openshift" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.720029 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f69753e1-8094-4124-8ce3-7978d53239f6" containerName="installer" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.721109 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.724979 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.725582 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.725912 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.726117 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.726312 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.726501 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.726774 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.726960 
4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.727265 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.728165 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.728228 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.728169 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.738196 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.739720 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d787499bd-wkmvm"] Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.750306 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.750821 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.789589 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.801916 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.816057 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851370 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851415 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851441 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-audit-policies\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851504 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-service-ca\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: 
\"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851555 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851595 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-login\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851638 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851682 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-error\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 
16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851707 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9ckc\" (UniqueName: \"kubernetes.io/projected/24b161c6-9de8-4d99-a325-6e9bfbafc51e-kube-api-access-c9ckc\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851743 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24b161c6-9de8-4d99-a325-6e9bfbafc51e-audit-dir\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851770 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-session\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851788 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851809 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.851835 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-router-certs\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.854632 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.878526 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.917197 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953280 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-login\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953369 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953450 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-error\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953497 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9ckc\" (UniqueName: \"kubernetes.io/projected/24b161c6-9de8-4d99-a325-6e9bfbafc51e-kube-api-access-c9ckc\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953573 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24b161c6-9de8-4d99-a325-6e9bfbafc51e-audit-dir\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953694 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-session\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: 
I0217 16:46:49.953742 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953795 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953851 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-router-certs\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953914 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.953973 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.954042 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-audit-policies\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.954097 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-service-ca\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.954147 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.954814 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 
17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.954922 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.955059 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24b161c6-9de8-4d99-a325-6e9bfbafc51e-audit-dir\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.955686 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-service-ca\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.955703 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24b161c6-9de8-4d99-a325-6e9bfbafc51e-audit-policies\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.960329 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-session\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: 
\"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.960500 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-router-certs\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.960742 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-login\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.961192 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.962079 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-error\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.963178 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.963653 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.965274 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24b161c6-9de8-4d99-a325-6e9bfbafc51e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.978730 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 16:46:49 crc kubenswrapper[4694]: I0217 16:46:49.985300 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9ckc\" (UniqueName: \"kubernetes.io/projected/24b161c6-9de8-4d99-a325-6e9bfbafc51e-kube-api-access-c9ckc\") pod \"oauth-openshift-d787499bd-wkmvm\" (UID: \"24b161c6-9de8-4d99-a325-6e9bfbafc51e\") " pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.069477 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 16:46:50 crc 
kubenswrapper[4694]: I0217 16:46:50.070038 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.219975 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.331363 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.408363 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.409954 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.478201 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.489845 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.622224 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.633902 4694 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.633949 4694 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.634001 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.634778 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"267c0ecf492ed07987863d1462fc4a8cdc69e0019d02ab555f4797b78cc94704"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.634878 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://267c0ecf492ed07987863d1462fc4a8cdc69e0019d02ab555f4797b78cc94704" gracePeriod=30 Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.646299 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.652525 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.699776 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.705354 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.719268 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.764410 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.907643 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 16:46:50 crc kubenswrapper[4694]: I0217 16:46:50.930131 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.263251 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.267349 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.356312 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.488502 4694 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.559156 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.562349 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.563632 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.578487 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.584906 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d787499bd-wkmvm"] Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.623853 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.637202 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.839245 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.879053 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 16:46:51 crc kubenswrapper[4694]: I0217 16:46:51.896567 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.140459 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.145964 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" event={"ID":"24b161c6-9de8-4d99-a325-6e9bfbafc51e","Type":"ContainerStarted","Data":"99207ddf70db89d43c85add61d73ef4043b8e2dd446ba23d2ec1ffc657505d9b"} Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.146018 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" event={"ID":"24b161c6-9de8-4d99-a325-6e9bfbafc51e","Type":"ContainerStarted","Data":"678f72a15bb172e10cc843442c68e307647eff3491ee39069d27daa0c6a1e839"} Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.146259 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.147493 4694 patch_prober.go:28] interesting pod/oauth-openshift-d787499bd-wkmvm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" start-of-body= Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.147549 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" podUID="24b161c6-9de8-4d99-a325-6e9bfbafc51e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.175570 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" podStartSLOduration=59.175547633 podStartE2EDuration="59.175547633s" podCreationTimestamp="2026-02-17 16:45:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:46:52.174153948 +0000 UTC m=+279.931229292" watchObservedRunningTime="2026-02-17 16:46:52.175547633 +0000 UTC m=+279.932622997" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.181653 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.230708 4694 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.294419 4694 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.294649 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af" gracePeriod=5 Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.296005 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.362494 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.451354 4694 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.680169 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.746054 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.831501 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 16:46:52 crc kubenswrapper[4694]: I0217 16:46:52.938988 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 16:46:53 crc 
kubenswrapper[4694]: I0217 16:46:53.030749 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.095869 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.123360 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.147323 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.151447 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-d787499bd-wkmvm_24b161c6-9de8-4d99-a325-6e9bfbafc51e/oauth-openshift/0.log" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.151497 4694 generic.go:334] "Generic (PLEG): container finished" podID="24b161c6-9de8-4d99-a325-6e9bfbafc51e" containerID="99207ddf70db89d43c85add61d73ef4043b8e2dd446ba23d2ec1ffc657505d9b" exitCode=255 Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.151526 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" event={"ID":"24b161c6-9de8-4d99-a325-6e9bfbafc51e","Type":"ContainerDied","Data":"99207ddf70db89d43c85add61d73ef4043b8e2dd446ba23d2ec1ffc657505d9b"} Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.151998 4694 scope.go:117] "RemoveContainer" containerID="99207ddf70db89d43c85add61d73ef4043b8e2dd446ba23d2ec1ffc657505d9b" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.176629 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.201458 4694 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.250847 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.268072 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.359801 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.404671 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.593129 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.614274 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.697653 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.946400 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 16:46:53 crc kubenswrapper[4694]: I0217 16:46:53.996244 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.033853 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.158772 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-d787499bd-wkmvm_24b161c6-9de8-4d99-a325-6e9bfbafc51e/oauth-openshift/0.log" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.159080 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" event={"ID":"24b161c6-9de8-4d99-a325-6e9bfbafc51e","Type":"ContainerStarted","Data":"1f1da77c4edbd6c80d6a6293b11d7017b6a0604218f1e5aa100999be76c070f1"} Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.160476 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.164985 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-d787499bd-wkmvm" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.187906 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.238596 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.484025 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.650390 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 16:46:54 crc kubenswrapper[4694]: I0217 16:46:54.910901 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.137753 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.148001 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.165326 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.215687 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.248430 4694 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.344367 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.416459 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.491314 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 16:46:55 crc kubenswrapper[4694]: I0217 16:46:55.521351 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 16:46:56 crc kubenswrapper[4694]: I0217 16:46:56.163899 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 16:46:56 crc kubenswrapper[4694]: I0217 16:46:56.221232 4694 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 16:46:57 crc kubenswrapper[4694]: I0217 16:46:57.873653 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 16:46:57 crc kubenswrapper[4694]: I0217 16:46:57.873725 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.062135 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.062508 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.062732 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.062988 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.063204 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.062361 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.062934 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.063122 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.063259 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.064307 4694 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.064478 4694 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.065384 4694 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.065548 4694 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.075420 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.167640 4694 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.183700 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.183766 4694 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af" exitCode=137 Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.183810 4694 scope.go:117] "RemoveContainer" containerID="e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.183868 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.206444 4694 scope.go:117] "RemoveContainer" containerID="e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af" Feb 17 16:46:58 crc kubenswrapper[4694]: E0217 16:46:58.207025 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af\": container with ID starting with e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af not found: ID does not exist" containerID="e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.207067 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af"} err="failed to get container status \"e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af\": rpc error: code = NotFound desc = could not find container \"e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af\": container with ID starting with e16cce6a6e7560a3fb3954785562e20f6c51d7d8725e22ba21eb61251079f3af not found: ID does not exist" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.907277 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.907829 4694 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.919057 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 
16:46:58.919094 4694 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0376e03a-f060-48b4-8e29-e05f572e0459" Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.923096 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 16:46:58 crc kubenswrapper[4694]: I0217 16:46:58.923123 4694 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0376e03a-f060-48b4-8e29-e05f572e0459" Feb 17 16:47:04 crc kubenswrapper[4694]: I0217 16:47:04.186976 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 16:47:06 crc kubenswrapper[4694]: I0217 16:47:06.001448 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 16:47:11 crc kubenswrapper[4694]: I0217 16:47:11.454036 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 16:47:12 crc kubenswrapper[4694]: I0217 16:47:12.244381 4694 generic.go:334] "Generic (PLEG): container finished" podID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerID="0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1" exitCode=0 Feb 17 16:47:12 crc kubenswrapper[4694]: I0217 16:47:12.244469 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" event={"ID":"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c","Type":"ContainerDied","Data":"0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1"} Feb 17 16:47:12 crc kubenswrapper[4694]: I0217 16:47:12.245058 4694 scope.go:117] "RemoveContainer" 
containerID="0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1" Feb 17 16:47:12 crc kubenswrapper[4694]: I0217 16:47:12.710073 4694 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 16:47:13 crc kubenswrapper[4694]: I0217 16:47:13.252258 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" event={"ID":"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c","Type":"ContainerStarted","Data":"ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607"} Feb 17 16:47:13 crc kubenswrapper[4694]: I0217 16:47:13.252932 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:47:13 crc kubenswrapper[4694]: I0217 16:47:13.255715 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:47:19 crc kubenswrapper[4694]: I0217 16:47:19.193144 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 16:47:19 crc kubenswrapper[4694]: I0217 16:47:19.875164 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 16:47:21 crc kubenswrapper[4694]: I0217 16:47:21.303236 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 17 16:47:21 crc kubenswrapper[4694]: I0217 16:47:21.306549 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 16:47:21 crc kubenswrapper[4694]: I0217 16:47:21.306597 4694 generic.go:334] 
"Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="267c0ecf492ed07987863d1462fc4a8cdc69e0019d02ab555f4797b78cc94704" exitCode=137 Feb 17 16:47:21 crc kubenswrapper[4694]: I0217 16:47:21.306644 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"267c0ecf492ed07987863d1462fc4a8cdc69e0019d02ab555f4797b78cc94704"} Feb 17 16:47:21 crc kubenswrapper[4694]: I0217 16:47:21.306676 4694 scope.go:117] "RemoveContainer" containerID="0fdb1a1ddcaa0374d6c0940db4e7091861dabd8c792a55eca82f698db2a03e31" Feb 17 16:47:22 crc kubenswrapper[4694]: I0217 16:47:22.314378 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 17 16:47:22 crc kubenswrapper[4694]: I0217 16:47:22.316002 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b2255f1d19fdfde8e0f4e80f6a0d15638e61eaa8c07e53849720eb014bee3601"} Feb 17 16:47:23 crc kubenswrapper[4694]: I0217 16:47:23.793239 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 16:47:25 crc kubenswrapper[4694]: I0217 16:47:25.741302 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 16:47:26 crc kubenswrapper[4694]: I0217 16:47:26.838043 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 16:47:27 crc kubenswrapper[4694]: I0217 16:47:27.483819 4694 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 16:47:29 crc kubenswrapper[4694]: I0217 16:47:29.426429 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 16:47:29 crc kubenswrapper[4694]: I0217 16:47:29.480056 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 16:47:30 crc kubenswrapper[4694]: I0217 16:47:30.463284 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:47:30 crc kubenswrapper[4694]: I0217 16:47:30.634245 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:47:30 crc kubenswrapper[4694]: I0217 16:47:30.639098 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:47:31 crc kubenswrapper[4694]: I0217 16:47:31.379772 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.333308 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"] Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.334021 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" podUID="d1c76767-8f16-4926-b632-8611bc27de87" containerName="route-controller-manager" containerID="cri-o://00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1" gracePeriod=30 Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.342769 4694 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vgfk2"] Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.343155 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" podUID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" containerName="controller-manager" containerID="cri-o://887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43" gracePeriod=30 Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.756916 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.761529 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828347 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghtf5\" (UniqueName: \"kubernetes.io/projected/eea272da-4da9-4f26-b66c-1aba9bbde6bc-kube-api-access-ghtf5\") pod \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828427 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-client-ca\") pod \"d1c76767-8f16-4926-b632-8611bc27de87\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828448 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-config\") pod \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " Feb 17 
16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828475 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4cp9\" (UniqueName: \"kubernetes.io/projected/d1c76767-8f16-4926-b632-8611bc27de87-kube-api-access-c4cp9\") pod \"d1c76767-8f16-4926-b632-8611bc27de87\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828492 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1c76767-8f16-4926-b632-8611bc27de87-serving-cert\") pod \"d1c76767-8f16-4926-b632-8611bc27de87\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828530 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-proxy-ca-bundles\") pod \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828558 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-config\") pod \"d1c76767-8f16-4926-b632-8611bc27de87\" (UID: \"d1c76767-8f16-4926-b632-8611bc27de87\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828587 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eea272da-4da9-4f26-b66c-1aba9bbde6bc-serving-cert\") pod \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.828650 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-client-ca\") 
pod \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\" (UID: \"eea272da-4da9-4f26-b66c-1aba9bbde6bc\") " Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.829643 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "eea272da-4da9-4f26-b66c-1aba9bbde6bc" (UID: "eea272da-4da9-4f26-b66c-1aba9bbde6bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.829889 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-config" (OuterVolumeSpecName: "config") pod "d1c76767-8f16-4926-b632-8611bc27de87" (UID: "d1c76767-8f16-4926-b632-8611bc27de87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.829898 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-config" (OuterVolumeSpecName: "config") pod "eea272da-4da9-4f26-b66c-1aba9bbde6bc" (UID: "eea272da-4da9-4f26-b66c-1aba9bbde6bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.829991 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eea272da-4da9-4f26-b66c-1aba9bbde6bc" (UID: "eea272da-4da9-4f26-b66c-1aba9bbde6bc"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.830018 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1c76767-8f16-4926-b632-8611bc27de87" (UID: "d1c76767-8f16-4926-b632-8611bc27de87"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.838698 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea272da-4da9-4f26-b66c-1aba9bbde6bc-kube-api-access-ghtf5" (OuterVolumeSpecName: "kube-api-access-ghtf5") pod "eea272da-4da9-4f26-b66c-1aba9bbde6bc" (UID: "eea272da-4da9-4f26-b66c-1aba9bbde6bc"). InnerVolumeSpecName "kube-api-access-ghtf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.838713 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c76767-8f16-4926-b632-8611bc27de87-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1c76767-8f16-4926-b632-8611bc27de87" (UID: "d1c76767-8f16-4926-b632-8611bc27de87"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.839047 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c76767-8f16-4926-b632-8611bc27de87-kube-api-access-c4cp9" (OuterVolumeSpecName: "kube-api-access-c4cp9") pod "d1c76767-8f16-4926-b632-8611bc27de87" (UID: "d1c76767-8f16-4926-b632-8611bc27de87"). InnerVolumeSpecName "kube-api-access-c4cp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.840382 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea272da-4da9-4f26-b66c-1aba9bbde6bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eea272da-4da9-4f26-b66c-1aba9bbde6bc" (UID: "eea272da-4da9-4f26-b66c-1aba9bbde6bc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930178 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4cp9\" (UniqueName: \"kubernetes.io/projected/d1c76767-8f16-4926-b632-8611bc27de87-kube-api-access-c4cp9\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930206 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1c76767-8f16-4926-b632-8611bc27de87-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930215 4694 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930225 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930234 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eea272da-4da9-4f26-b66c-1aba9bbde6bc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930242 4694 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930250 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghtf5\" (UniqueName: \"kubernetes.io/projected/eea272da-4da9-4f26-b66c-1aba9bbde6bc-kube-api-access-ghtf5\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930258 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eea272da-4da9-4f26-b66c-1aba9bbde6bc-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:40 crc kubenswrapper[4694]: I0217 16:47:40.930265 4694 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1c76767-8f16-4926-b632-8611bc27de87-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411395 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl"] Feb 17 16:47:41 crc kubenswrapper[4694]: E0217 16:47:41.411658 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411672 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 16:47:41 crc kubenswrapper[4694]: E0217 16:47:41.411682 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" containerName="controller-manager" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411688 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" containerName="controller-manager" Feb 17 16:47:41 crc kubenswrapper[4694]: E0217 16:47:41.411696 4694 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d1c76767-8f16-4926-b632-8611bc27de87" containerName="route-controller-manager" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411702 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c76767-8f16-4926-b632-8611bc27de87" containerName="route-controller-manager" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411784 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c76767-8f16-4926-b632-8611bc27de87" containerName="route-controller-manager" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411793 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" containerName="controller-manager" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.411805 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.412204 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.414786 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.415440 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.426180 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.426817 4694 generic.go:334] "Generic (PLEG): container finished" podID="d1c76767-8f16-4926-b632-8611bc27de87" containerID="00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1" exitCode=0 Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.426867 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.426859 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" event={"ID":"d1c76767-8f16-4926-b632-8611bc27de87","Type":"ContainerDied","Data":"00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1"} Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.426992 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj" event={"ID":"d1c76767-8f16-4926-b632-8611bc27de87","Type":"ContainerDied","Data":"aff6e7835dc8589138a916afdf29efc0819f7f4adbaecc79097c7cdb57bebad4"} Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.427037 4694 scope.go:117] "RemoveContainer" containerID="00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.428504 4694 generic.go:334] "Generic (PLEG): container finished" podID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" containerID="887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43" exitCode=0 Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.428527 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" event={"ID":"eea272da-4da9-4f26-b66c-1aba9bbde6bc","Type":"ContainerDied","Data":"887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43"} Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.428543 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" event={"ID":"eea272da-4da9-4f26-b66c-1aba9bbde6bc","Type":"ContainerDied","Data":"8fa6649b2529f89760c1210aae6991ed89ebc761d6a4f8f2c732056c9e56df54"} Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.428572 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vgfk2" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.429365 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.451153 4694 scope.go:117] "RemoveContainer" containerID="00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1" Feb 17 16:47:41 crc kubenswrapper[4694]: E0217 16:47:41.452391 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1\": container with ID starting with 00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1 not found: ID does not exist" containerID="00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.452505 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1"} err="failed to get container status \"00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1\": rpc error: code = 
NotFound desc = could not find container \"00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1\": container with ID starting with 00046de53e0534277433d3a22523e9f7099edf012a337b081d6a4bc9571dc1e1 not found: ID does not exist" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.452564 4694 scope.go:117] "RemoveContainer" containerID="887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.471177 4694 scope.go:117] "RemoveContainer" containerID="887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43" Feb 17 16:47:41 crc kubenswrapper[4694]: E0217 16:47:41.471680 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43\": container with ID starting with 887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43 not found: ID does not exist" containerID="887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.471731 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43"} err="failed to get container status \"887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43\": rpc error: code = NotFound desc = could not find container \"887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43\": container with ID starting with 887022bc7fa7be3999aafdc63b7dafce18412a1c5df3954cbbdd8d965ea10c43 not found: ID does not exist" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.495384 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.499117 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7vtpj"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.503410 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vgfk2"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.507685 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vgfk2"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537122 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-serving-cert\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537405 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-config\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537497 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svqj\" (UniqueName: \"kubernetes.io/projected/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-kube-api-access-9svqj\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537578 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-serving-cert\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537702 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-client-ca\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537775 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-client-ca\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537843 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrcfr\" (UniqueName: \"kubernetes.io/projected/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-kube-api-access-wrcfr\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.537927 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-proxy-ca-bundles\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " 
pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.538015 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-config\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.638921 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-serving-cert\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.639170 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-config\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.639285 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9svqj\" (UniqueName: \"kubernetes.io/projected/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-kube-api-access-9svqj\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.639393 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-serving-cert\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.639939 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-client-ca\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.640053 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-client-ca\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.640160 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrcfr\" (UniqueName: \"kubernetes.io/projected/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-kube-api-access-wrcfr\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.640258 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-proxy-ca-bundles\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: 
I0217 16:47:41.640363 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-config\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.640900 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-config\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.641756 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-client-ca\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.641986 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-client-ca\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.642270 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-proxy-ca-bundles\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " 
pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.643281 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-config\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.643372 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-serving-cert\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.643996 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-serving-cert\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.658129 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svqj\" (UniqueName: \"kubernetes.io/projected/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-kube-api-access-9svqj\") pod \"controller-manager-866ddd4d9c-5l7ll\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.659387 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrcfr\" (UniqueName: 
\"kubernetes.io/projected/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-kube-api-access-wrcfr\") pod \"route-controller-manager-8b979b485-pp2zl\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.731105 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.759431 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.949324 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl"] Feb 17 16:47:41 crc kubenswrapper[4694]: I0217 16:47:41.988037 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll"] Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.435064 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" event={"ID":"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0","Type":"ContainerStarted","Data":"8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d"} Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.435367 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" event={"ID":"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0","Type":"ContainerStarted","Data":"f1a1f68a2eac4ea748a87d8edaddf17690b8fce51075c1228e90bfbc3170a60a"} Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.437331 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:42 crc 
kubenswrapper[4694]: I0217 16:47:42.440667 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" event={"ID":"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08","Type":"ContainerStarted","Data":"91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6"} Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.440699 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" event={"ID":"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08","Type":"ContainerStarted","Data":"878bc40bbb82ee5def45c18348efb55885683b0e23690256aac9903d39e86d46"} Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.441531 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.443200 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.458707 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" podStartSLOduration=2.4586880779999998 podStartE2EDuration="2.458688078s" podCreationTimestamp="2026-02-17 16:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:47:42.456145456 +0000 UTC m=+330.213220790" watchObservedRunningTime="2026-02-17 16:47:42.458688078 +0000 UTC m=+330.215763392" Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.518891 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" podStartSLOduration=2.5188717819999997 podStartE2EDuration="2.518871782s" 
podCreationTimestamp="2026-02-17 16:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:47:42.515311115 +0000 UTC m=+330.272386439" watchObservedRunningTime="2026-02-17 16:47:42.518871782 +0000 UTC m=+330.275947106" Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.557516 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.903545 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c76767-8f16-4926-b632-8611bc27de87" path="/var/lib/kubelet/pods/d1c76767-8f16-4926-b632-8611bc27de87/volumes" Feb 17 16:47:42 crc kubenswrapper[4694]: I0217 16:47:42.904160 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea272da-4da9-4f26-b66c-1aba9bbde6bc" path="/var/lib/kubelet/pods/eea272da-4da9-4f26-b66c-1aba9bbde6bc/volumes" Feb 17 16:47:44 crc kubenswrapper[4694]: I0217 16:47:44.617643 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:47:44 crc kubenswrapper[4694]: I0217 16:47:44.618021 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:48:06 crc kubenswrapper[4694]: I0217 16:48:06.517725 4694 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rzdk7 container/packageserver namespace/openshift-operator-lifecycle-manager: 
Liveness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 16:48:06 crc kubenswrapper[4694]: I0217 16:48:06.518433 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" podUID="3d017650-73fd-4db1-958d-7bca865a125b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 16:48:06 crc kubenswrapper[4694]: I0217 16:48:06.517725 4694 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rzdk7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 16:48:06 crc kubenswrapper[4694]: I0217 16:48:06.518582 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rzdk7" podUID="3d017650-73fd-4db1-958d-7bca865a125b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 16:48:11 crc kubenswrapper[4694]: I0217 16:48:11.961131 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tt4fk"] Feb 17 16:48:11 crc kubenswrapper[4694]: I0217 16:48:11.962246 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:11 crc kubenswrapper[4694]: I0217 16:48:11.973854 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tt4fk"] Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154184 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154234 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154254 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-trusted-ca\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154270 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8bxn\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-kube-api-access-m8bxn\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154308 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-registry-certificates\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154339 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154355 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-bound-sa-token\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.154370 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-registry-tls\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.175488 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.255883 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-registry-certificates\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.256323 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-bound-sa-token\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.256361 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-registry-tls\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.256425 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.256471 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.256505 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8bxn\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-kube-api-access-m8bxn\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.256532 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-trusted-ca\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.257775 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.257902 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-trusted-ca\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc 
kubenswrapper[4694]: I0217 16:48:12.258183 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-registry-certificates\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.264490 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-registry-tls\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.268374 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.280181 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-bound-sa-token\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.280679 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bxn\" (UniqueName: \"kubernetes.io/projected/20150a9b-b68c-48e6-8ff0-5f2d9add98e1-kube-api-access-m8bxn\") pod \"image-registry-66df7c8f76-tt4fk\" (UID: \"20150a9b-b68c-48e6-8ff0-5f2d9add98e1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.283808 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:12 crc kubenswrapper[4694]: I0217 16:48:12.702699 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tt4fk"] Feb 17 16:48:13 crc kubenswrapper[4694]: I0217 16:48:13.530048 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" event={"ID":"20150a9b-b68c-48e6-8ff0-5f2d9add98e1","Type":"ContainerStarted","Data":"cde4f667e22c740b8b6ebd37ab1fd504d35639e98c7b16cbf03a15230f568f34"} Feb 17 16:48:13 crc kubenswrapper[4694]: I0217 16:48:13.530538 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" event={"ID":"20150a9b-b68c-48e6-8ff0-5f2d9add98e1","Type":"ContainerStarted","Data":"9b6c0313f33c2e4522da3bdb135799f5ad4e988b3fd8399d52cb23568e449f15"} Feb 17 16:48:13 crc kubenswrapper[4694]: I0217 16:48:13.530556 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:13 crc kubenswrapper[4694]: I0217 16:48:13.552836 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" podStartSLOduration=2.552815573 podStartE2EDuration="2.552815573s" podCreationTimestamp="2026-02-17 16:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:48:13.546753995 +0000 UTC m=+361.303829419" watchObservedRunningTime="2026-02-17 16:48:13.552815573 +0000 UTC m=+361.309890907" Feb 17 16:48:13 crc kubenswrapper[4694]: I0217 16:48:13.706699 4694 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl"] Feb 17 16:48:13 crc kubenswrapper[4694]: I0217 16:48:13.706957 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" podUID="2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" containerName="route-controller-manager" containerID="cri-o://91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6" gracePeriod=30 Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.138859 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.281398 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-serving-cert\") pod \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.281473 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrcfr\" (UniqueName: \"kubernetes.io/projected/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-kube-api-access-wrcfr\") pod \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.281508 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-config\") pod \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.281544 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-client-ca\") pod \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\" (UID: \"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08\") " Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.282542 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-client-ca" (OuterVolumeSpecName: "client-ca") pod "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" (UID: "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.282550 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-config" (OuterVolumeSpecName: "config") pod "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" (UID: "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.287571 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" (UID: "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.287878 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-kube-api-access-wrcfr" (OuterVolumeSpecName: "kube-api-access-wrcfr") pod "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" (UID: "2891806a-dbc6-4cc0-a71f-9a4e3e81bc08"). InnerVolumeSpecName "kube-api-access-wrcfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.382787 4694 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.382839 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.382852 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrcfr\" (UniqueName: \"kubernetes.io/projected/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-kube-api-access-wrcfr\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.382865 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.536079 4694 generic.go:334] "Generic (PLEG): container finished" podID="2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" containerID="91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6" exitCode=0 Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.536135 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" event={"ID":"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08","Type":"ContainerDied","Data":"91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6"} Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.536168 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" 
event={"ID":"2891806a-dbc6-4cc0-a71f-9a4e3e81bc08","Type":"ContainerDied","Data":"878bc40bbb82ee5def45c18348efb55885683b0e23690256aac9903d39e86d46"} Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.536166 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.536183 4694 scope.go:117] "RemoveContainer" containerID="91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.551670 4694 scope.go:117] "RemoveContainer" containerID="91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6" Feb 17 16:48:14 crc kubenswrapper[4694]: E0217 16:48:14.551987 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6\": container with ID starting with 91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6 not found: ID does not exist" containerID="91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.552023 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6"} err="failed to get container status \"91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6\": rpc error: code = NotFound desc = could not find container \"91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6\": container with ID starting with 91dd10bc9fb70767c696bf042145a8688daa9b4e30c50a263460a89c9fb41ce6 not found: ID does not exist" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.560112 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl"] Feb 17 
16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.566460 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8b979b485-pp2zl"] Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.618060 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.618112 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:48:14 crc kubenswrapper[4694]: I0217 16:48:14.902373 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" path="/var/lib/kubelet/pods/2891806a-dbc6-4cc0-a71f-9a4e3e81bc08/volumes" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.507044 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts"] Feb 17 16:48:15 crc kubenswrapper[4694]: E0217 16:48:15.507672 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" containerName="route-controller-manager" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.507700 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" containerName="route-controller-manager" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.507885 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="2891806a-dbc6-4cc0-a71f-9a4e3e81bc08" 
containerName="route-controller-manager" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.508433 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.510497 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.510506 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.511418 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.511716 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.512019 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.512208 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.516764 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts"] Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.698467 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4773ee54-7193-4a63-a7c8-b1d97b42492e-client-ca\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " 
pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.698534 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z94zv\" (UniqueName: \"kubernetes.io/projected/4773ee54-7193-4a63-a7c8-b1d97b42492e-kube-api-access-z94zv\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.698562 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4773ee54-7193-4a63-a7c8-b1d97b42492e-serving-cert\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.698652 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4773ee54-7193-4a63-a7c8-b1d97b42492e-config\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.799804 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z94zv\" (UniqueName: \"kubernetes.io/projected/4773ee54-7193-4a63-a7c8-b1d97b42492e-kube-api-access-z94zv\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.799869 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4773ee54-7193-4a63-a7c8-b1d97b42492e-serving-cert\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.799901 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4773ee54-7193-4a63-a7c8-b1d97b42492e-config\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.799970 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4773ee54-7193-4a63-a7c8-b1d97b42492e-client-ca\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.800994 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4773ee54-7193-4a63-a7c8-b1d97b42492e-client-ca\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.801228 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4773ee54-7193-4a63-a7c8-b1d97b42492e-config\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " 
pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.812236 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4773ee54-7193-4a63-a7c8-b1d97b42492e-serving-cert\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.816531 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z94zv\" (UniqueName: \"kubernetes.io/projected/4773ee54-7193-4a63-a7c8-b1d97b42492e-kube-api-access-z94zv\") pod \"route-controller-manager-768d54bfd9-ndvts\" (UID: \"4773ee54-7193-4a63-a7c8-b1d97b42492e\") " pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:15 crc kubenswrapper[4694]: I0217 16:48:15.890222 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:16 crc kubenswrapper[4694]: I0217 16:48:16.290899 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts"] Feb 17 16:48:16 crc kubenswrapper[4694]: W0217 16:48:16.296551 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4773ee54_7193_4a63_a7c8_b1d97b42492e.slice/crio-3731b5359ab21e0f58510a3e5ea6891d904c7a1a03825f27a4c7d9d99c11e3e4 WatchSource:0}: Error finding container 3731b5359ab21e0f58510a3e5ea6891d904c7a1a03825f27a4c7d9d99c11e3e4: Status 404 returned error can't find the container with id 3731b5359ab21e0f58510a3e5ea6891d904c7a1a03825f27a4c7d9d99c11e3e4 Feb 17 16:48:16 crc kubenswrapper[4694]: I0217 16:48:16.594678 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" event={"ID":"4773ee54-7193-4a63-a7c8-b1d97b42492e","Type":"ContainerStarted","Data":"2e72a27a834615fe9bdf54210aeaab2b9b8f2e85929b62e8600ccccd8b72246b"} Feb 17 16:48:16 crc kubenswrapper[4694]: I0217 16:48:16.594721 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" event={"ID":"4773ee54-7193-4a63-a7c8-b1d97b42492e","Type":"ContainerStarted","Data":"3731b5359ab21e0f58510a3e5ea6891d904c7a1a03825f27a4c7d9d99c11e3e4"} Feb 17 16:48:16 crc kubenswrapper[4694]: I0217 16:48:16.595164 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 16:48:16 crc kubenswrapper[4694]: I0217 16:48:16.900618 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" Feb 17 
16:48:16 crc kubenswrapper[4694]: I0217 16:48:16.917551 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-768d54bfd9-ndvts" podStartSLOduration=3.9175340309999998 podStartE2EDuration="3.917534031s" podCreationTimestamp="2026-02-17 16:48:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:48:16.60882296 +0000 UTC m=+364.365898294" watchObservedRunningTime="2026-02-17 16:48:16.917534031 +0000 UTC m=+364.674609355" Feb 17 16:48:32 crc kubenswrapper[4694]: I0217 16:48:32.290436 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tt4fk" Feb 17 16:48:32 crc kubenswrapper[4694]: I0217 16:48:32.390952 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6d7zh"] Feb 17 16:48:33 crc kubenswrapper[4694]: I0217 16:48:33.695716 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll"] Feb 17 16:48:33 crc kubenswrapper[4694]: I0217 16:48:33.695971 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" podUID="dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" containerName="controller-manager" containerID="cri-o://8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d" gracePeriod=30 Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.065256 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.191205 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9svqj\" (UniqueName: \"kubernetes.io/projected/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-kube-api-access-9svqj\") pod \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.191274 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-client-ca\") pod \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.191311 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-serving-cert\") pod \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.191347 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-proxy-ca-bundles\") pod \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.191392 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-config\") pod \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\" (UID: \"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0\") " Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.192408 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-client-ca" (OuterVolumeSpecName: "client-ca") pod "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" (UID: "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.192451 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" (UID: "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.192515 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-config" (OuterVolumeSpecName: "config") pod "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" (UID: "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.196856 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" (UID: "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.203651 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-kube-api-access-9svqj" (OuterVolumeSpecName: "kube-api-access-9svqj") pod "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" (UID: "dc58be6c-7d6f-4890-8bb3-6eb2c51732b0"). InnerVolumeSpecName "kube-api-access-9svqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.292967 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9svqj\" (UniqueName: \"kubernetes.io/projected/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-kube-api-access-9svqj\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.293654 4694 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.293786 4694 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.293871 4694 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.293955 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.693952 4694 generic.go:334] "Generic (PLEG): container finished" podID="dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" containerID="8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d" exitCode=0 Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.693999 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" event={"ID":"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0","Type":"ContainerDied","Data":"8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d"} Feb 17 16:48:34 crc 
kubenswrapper[4694]: I0217 16:48:34.694009 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.694027 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll" event={"ID":"dc58be6c-7d6f-4890-8bb3-6eb2c51732b0","Type":"ContainerDied","Data":"f1a1f68a2eac4ea748a87d8edaddf17690b8fce51075c1228e90bfbc3170a60a"} Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.694045 4694 scope.go:117] "RemoveContainer" containerID="8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.711648 4694 scope.go:117] "RemoveContainer" containerID="8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d" Feb 17 16:48:34 crc kubenswrapper[4694]: E0217 16:48:34.712220 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d\": container with ID starting with 8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d not found: ID does not exist" containerID="8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.712256 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d"} err="failed to get container status \"8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d\": rpc error: code = NotFound desc = could not find container \"8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d\": container with ID starting with 8ddff07f70abfc850148d26e9d135ec67c1dbc8f310305688e81513a6d3b0b0d not found: ID does not exist" Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 
16:48:34.722592 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll"] Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.726203 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-866ddd4d9c-5l7ll"] Feb 17 16:48:34 crc kubenswrapper[4694]: I0217 16:48:34.903068 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" path="/var/lib/kubelet/pods/dc58be6c-7d6f-4890-8bb3-6eb2c51732b0/volumes" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.517116 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77b9d49cf-bvm52"] Feb 17 16:48:35 crc kubenswrapper[4694]: E0217 16:48:35.517564 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" containerName="controller-manager" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.517576 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" containerName="controller-manager" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.517690 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc58be6c-7d6f-4890-8bb3-6eb2c51732b0" containerName="controller-manager" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.518042 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.521277 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.522822 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.522837 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.522998 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.522913 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.525054 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.529902 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.543071 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b9d49cf-bvm52"] Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.611418 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-client-ca\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " 
pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.611690 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8s2\" (UniqueName: \"kubernetes.io/projected/336aa5bb-f55e-42af-81b0-fde8a33df5fb-kube-api-access-cb8s2\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.611831 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336aa5bb-f55e-42af-81b0-fde8a33df5fb-serving-cert\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.611955 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-proxy-ca-bundles\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.612041 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-config\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.712959 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-proxy-ca-bundles\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.713008 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-config\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.713073 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-client-ca\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.713107 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8s2\" (UniqueName: \"kubernetes.io/projected/336aa5bb-f55e-42af-81b0-fde8a33df5fb-kube-api-access-cb8s2\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.713152 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336aa5bb-f55e-42af-81b0-fde8a33df5fb-serving-cert\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.715136 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-proxy-ca-bundles\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.716340 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-client-ca\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.716660 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336aa5bb-f55e-42af-81b0-fde8a33df5fb-config\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.721914 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336aa5bb-f55e-42af-81b0-fde8a33df5fb-serving-cert\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc kubenswrapper[4694]: I0217 16:48:35.738626 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8s2\" (UniqueName: \"kubernetes.io/projected/336aa5bb-f55e-42af-81b0-fde8a33df5fb-kube-api-access-cb8s2\") pod \"controller-manager-77b9d49cf-bvm52\" (UID: \"336aa5bb-f55e-42af-81b0-fde8a33df5fb\") " pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:35 crc 
kubenswrapper[4694]: I0217 16:48:35.838688 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:36 crc kubenswrapper[4694]: I0217 16:48:36.274159 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b9d49cf-bvm52"] Feb 17 16:48:36 crc kubenswrapper[4694]: I0217 16:48:36.708331 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" event={"ID":"336aa5bb-f55e-42af-81b0-fde8a33df5fb","Type":"ContainerStarted","Data":"9160154fd52fdfa84226abc6f5ecb64f846cd8b88bbff9ad3bbd31c8a1dca68c"} Feb 17 16:48:36 crc kubenswrapper[4694]: I0217 16:48:36.708372 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" event={"ID":"336aa5bb-f55e-42af-81b0-fde8a33df5fb","Type":"ContainerStarted","Data":"cc0e439eaae82d018e8bbe46d4402b5cafd5542a3eff6fd3a744f30864acdb48"} Feb 17 16:48:36 crc kubenswrapper[4694]: I0217 16:48:36.708676 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:36 crc kubenswrapper[4694]: I0217 16:48:36.714587 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" Feb 17 16:48:36 crc kubenswrapper[4694]: I0217 16:48:36.751870 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77b9d49cf-bvm52" podStartSLOduration=3.751854024 podStartE2EDuration="3.751854024s" podCreationTimestamp="2026-02-17 16:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:48:36.729871566 +0000 UTC m=+384.486946880" 
watchObservedRunningTime="2026-02-17 16:48:36.751854024 +0000 UTC m=+384.508929338" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.618319 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.619333 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.619424 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.620448 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"963ecf435fb681d4097c1e2e11de629281374ce880fdb6edbb191e877f7901e8"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.621464 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://963ecf435fb681d4097c1e2e11de629281374ce880fdb6edbb191e877f7901e8" gracePeriod=600 Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.730462 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gp2g7"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.730830 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gp2g7" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="registry-server" containerID="cri-o://4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4" gracePeriod=30 Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.762544 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmjbf"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.763016 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rmjbf" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="registry-server" containerID="cri-o://244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5" gracePeriod=30 Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.765922 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8q2l"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.766242 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" containerID="cri-o://ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607" gracePeriod=30 Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.773478 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx6l2"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.773759 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tx6l2" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="registry-server" 
containerID="cri-o://f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9" gracePeriod=30 Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.774589 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xsqk6"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.775102 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xsqk6" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="registry-server" containerID="cri-o://2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa" gracePeriod=30 Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.789422 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w82n9"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.790065 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.801705 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w82n9"] Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.843323 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80752637-17b7-451f-a4f9-c15ff9d5bd47-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.843982 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80752637-17b7-451f-a4f9-c15ff9d5bd47-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.844044 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsw72\" (UniqueName: \"kubernetes.io/projected/80752637-17b7-451f-a4f9-c15ff9d5bd47-kube-api-access-qsw72\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.944842 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80752637-17b7-451f-a4f9-c15ff9d5bd47-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.945198 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80752637-17b7-451f-a4f9-c15ff9d5bd47-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.945217 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsw72\" (UniqueName: \"kubernetes.io/projected/80752637-17b7-451f-a4f9-c15ff9d5bd47-kube-api-access-qsw72\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.946097 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80752637-17b7-451f-a4f9-c15ff9d5bd47-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.955189 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/80752637-17b7-451f-a4f9-c15ff9d5bd47-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:44 crc kubenswrapper[4694]: I0217 16:48:44.961705 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsw72\" (UniqueName: \"kubernetes.io/projected/80752637-17b7-451f-a4f9-c15ff9d5bd47-kube-api-access-qsw72\") pod \"marketplace-operator-79b997595-w82n9\" (UID: \"80752637-17b7-451f-a4f9-c15ff9d5bd47\") " pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.261982 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.297198 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.452651 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-utilities\") pod \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.452772 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-catalog-content\") pod \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.452842 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsrhd\" (UniqueName: \"kubernetes.io/projected/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-kube-api-access-bsrhd\") pod \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\" (UID: \"c7a9bea3-8150-4246-9c2b-dd9d57e17f30\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.453802 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-utilities" (OuterVolumeSpecName: "utilities") pod "c7a9bea3-8150-4246-9c2b-dd9d57e17f30" (UID: "c7a9bea3-8150-4246-9c2b-dd9d57e17f30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.457776 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-kube-api-access-bsrhd" (OuterVolumeSpecName: "kube-api-access-bsrhd") pod "c7a9bea3-8150-4246-9c2b-dd9d57e17f30" (UID: "c7a9bea3-8150-4246-9c2b-dd9d57e17f30"). InnerVolumeSpecName "kube-api-access-bsrhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.491798 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.496256 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.501770 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.553968 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.553995 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsrhd\" (UniqueName: \"kubernetes.io/projected/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-kube-api-access-bsrhd\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.556113 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7a9bea3-8150-4246-9c2b-dd9d57e17f30" (UID: "c7a9bea3-8150-4246-9c2b-dd9d57e17f30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.557323 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.655320 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b72ks\" (UniqueName: \"kubernetes.io/projected/0f68e586-955c-4c2c-8b3e-a91f6b95a442-kube-api-access-b72ks\") pod \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.655657 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-catalog-content\") pod \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656044 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-utilities\") pod \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\" (UID: \"0f68e586-955c-4c2c-8b3e-a91f6b95a442\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656110 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtzbw\" (UniqueName: \"kubernetes.io/projected/f2002375-3db0-44d4-8c8d-e945a20a38d9-kube-api-access-mtzbw\") pod \"f2002375-3db0-44d4-8c8d-e945a20a38d9\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656151 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-catalog-content\") pod \"f2002375-3db0-44d4-8c8d-e945a20a38d9\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656176 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-catalog-content\") pod \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656249 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-utilities\") pod \"f2002375-3db0-44d4-8c8d-e945a20a38d9\" (UID: \"f2002375-3db0-44d4-8c8d-e945a20a38d9\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656282 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-utilities\") pod \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656304 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g7vx\" (UniqueName: \"kubernetes.io/projected/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-kube-api-access-9g7vx\") pod \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\" (UID: \"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656330 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc27r\" (UniqueName: \"kubernetes.io/projected/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-kube-api-access-vc27r\") pod \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656358 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-trusted-ca\") pod \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\" (UID: 
\"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656396 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-operator-metrics\") pod \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\" (UID: \"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c\") " Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.656789 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a9bea3-8150-4246-9c2b-dd9d57e17f30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.657103 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-utilities" (OuterVolumeSpecName: "utilities") pod "f2002375-3db0-44d4-8c8d-e945a20a38d9" (UID: "f2002375-3db0-44d4-8c8d-e945a20a38d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.657450 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-utilities" (OuterVolumeSpecName: "utilities") pod "0f68e586-955c-4c2c-8b3e-a91f6b95a442" (UID: "0f68e586-955c-4c2c-8b3e-a91f6b95a442"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.658762 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-utilities" (OuterVolumeSpecName: "utilities") pod "7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" (UID: "7ff751c3-1af4-4a0d-b057-302ecb2d7bd4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.659162 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" (UID: "4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.659901 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f68e586-955c-4c2c-8b3e-a91f6b95a442-kube-api-access-b72ks" (OuterVolumeSpecName: "kube-api-access-b72ks") pod "0f68e586-955c-4c2c-8b3e-a91f6b95a442" (UID: "0f68e586-955c-4c2c-8b3e-a91f6b95a442"). InnerVolumeSpecName "kube-api-access-b72ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.660055 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-kube-api-access-9g7vx" (OuterVolumeSpecName: "kube-api-access-9g7vx") pod "7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" (UID: "7ff751c3-1af4-4a0d-b057-302ecb2d7bd4"). InnerVolumeSpecName "kube-api-access-9g7vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.664101 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-kube-api-access-vc27r" (OuterVolumeSpecName: "kube-api-access-vc27r") pod "4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" (UID: "4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c"). InnerVolumeSpecName "kube-api-access-vc27r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.664757 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2002375-3db0-44d4-8c8d-e945a20a38d9-kube-api-access-mtzbw" (OuterVolumeSpecName: "kube-api-access-mtzbw") pod "f2002375-3db0-44d4-8c8d-e945a20a38d9" (UID: "f2002375-3db0-44d4-8c8d-e945a20a38d9"). InnerVolumeSpecName "kube-api-access-mtzbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.664930 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" (UID: "4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.708845 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" (UID: "7ff751c3-1af4-4a0d-b057-302ecb2d7bd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.710031 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f68e586-955c-4c2c-8b3e-a91f6b95a442" (UID: "0f68e586-955c-4c2c-8b3e-a91f6b95a442"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758289 4694 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758327 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758339 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b72ks\" (UniqueName: \"kubernetes.io/projected/0f68e586-955c-4c2c-8b3e-a91f6b95a442-kube-api-access-b72ks\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758351 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f68e586-955c-4c2c-8b3e-a91f6b95a442-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758363 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtzbw\" (UniqueName: \"kubernetes.io/projected/f2002375-3db0-44d4-8c8d-e945a20a38d9-kube-api-access-mtzbw\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758374 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758385 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc 
kubenswrapper[4694]: I0217 16:48:45.758396 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758407 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g7vx\" (UniqueName: \"kubernetes.io/projected/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4-kube-api-access-9g7vx\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758419 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc27r\" (UniqueName: \"kubernetes.io/projected/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-kube-api-access-vc27r\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.758434 4694 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.772437 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w82n9"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.779049 4694 generic.go:334] "Generic (PLEG): container finished" podID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerID="2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa" exitCode=0 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.779135 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsqk6" event={"ID":"f2002375-3db0-44d4-8c8d-e945a20a38d9","Type":"ContainerDied","Data":"2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.779147 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xsqk6" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.779169 4694 scope.go:117] "RemoveContainer" containerID="2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.779158 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsqk6" event={"ID":"f2002375-3db0-44d4-8c8d-e945a20a38d9","Type":"ContainerDied","Data":"53a9252944df2a649d7fa594975084f97637978bbedaf6352738b6eb540b8505"} Feb 17 16:48:45 crc kubenswrapper[4694]: W0217 16:48:45.782829 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80752637_17b7_451f_a4f9_c15ff9d5bd47.slice/crio-23edf7effa9fc8ef047f2003af2c35430683b9389978dc477fb21cdce5090fc1 WatchSource:0}: Error finding container 23edf7effa9fc8ef047f2003af2c35430683b9389978dc477fb21cdce5090fc1: Status 404 returned error can't find the container with id 23edf7effa9fc8ef047f2003af2c35430683b9389978dc477fb21cdce5090fc1 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.783949 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2002375-3db0-44d4-8c8d-e945a20a38d9" (UID: "f2002375-3db0-44d4-8c8d-e945a20a38d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.784455 4694 generic.go:334] "Generic (PLEG): container finished" podID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerID="4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4" exitCode=0 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.784599 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gp2g7" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.784792 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp2g7" event={"ID":"c7a9bea3-8150-4246-9c2b-dd9d57e17f30","Type":"ContainerDied","Data":"4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.784989 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gp2g7" event={"ID":"c7a9bea3-8150-4246-9c2b-dd9d57e17f30","Type":"ContainerDied","Data":"7e7a624116248b88f06db40151e24eb96e33f825e578b426938f5b7077733c56"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.787947 4694 generic.go:334] "Generic (PLEG): container finished" podID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerID="ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607" exitCode=0 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.788356 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" event={"ID":"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c","Type":"ContainerDied","Data":"ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.788395 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" event={"ID":"4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c","Type":"ContainerDied","Data":"b1f872531a34ead573707e214f0c8bb471608d5bf02b07c15bebccba41f21764"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.792360 4694 generic.go:334] "Generic (PLEG): container finished" podID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerID="244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5" exitCode=0 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.792429 4694 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-rmjbf" event={"ID":"0f68e586-955c-4c2c-8b3e-a91f6b95a442","Type":"ContainerDied","Data":"244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.792460 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmjbf" event={"ID":"0f68e586-955c-4c2c-8b3e-a91f6b95a442","Type":"ContainerDied","Data":"0edb71418bfca3ec158f3eb0b03d5c1e63917a21058205ba15df3f70e441bcfd"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.792531 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmjbf" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.793787 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.798591 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="963ecf435fb681d4097c1e2e11de629281374ce880fdb6edbb191e877f7901e8" exitCode=0 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.798669 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"963ecf435fb681d4097c1e2e11de629281374ce880fdb6edbb191e877f7901e8"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.798692 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"96aab37567d8c8b776aebabf47ba93d557454ae172096341f6f115f7ff5ec595"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.802077 4694 generic.go:334] "Generic (PLEG): container finished" 
podID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerID="f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9" exitCode=0 Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.802102 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx6l2" event={"ID":"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4","Type":"ContainerDied","Data":"f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.802118 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tx6l2" event={"ID":"7ff751c3-1af4-4a0d-b057-302ecb2d7bd4","Type":"ContainerDied","Data":"ff6b22c4977dc85ec19235e04dd66f15a6411b4bd67624e678865f431eab57d9"} Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.802159 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tx6l2" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.828262 4694 scope.go:117] "RemoveContainer" containerID="d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.861291 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2002375-3db0-44d4-8c8d-e945a20a38d9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.867262 4694 scope.go:117] "RemoveContainer" containerID="db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.895742 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx6l2"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.903002 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tx6l2"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 
16:48:45.908992 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmjbf"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.916362 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rmjbf"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.916529 4694 scope.go:117] "RemoveContainer" containerID="2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa" Feb 17 16:48:45 crc kubenswrapper[4694]: E0217 16:48:45.917009 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa\": container with ID starting with 2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa not found: ID does not exist" containerID="2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.917084 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa"} err="failed to get container status \"2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa\": rpc error: code = NotFound desc = could not find container \"2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa\": container with ID starting with 2acd5e59c2b5406ac87c7936a22fc631ce27dab1b9c400624465be3a26d956aa not found: ID does not exist" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.917139 4694 scope.go:117] "RemoveContainer" containerID="d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78" Feb 17 16:48:45 crc kubenswrapper[4694]: E0217 16:48:45.917410 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78\": container with ID 
starting with d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78 not found: ID does not exist" containerID="d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.917493 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78"} err="failed to get container status \"d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78\": rpc error: code = NotFound desc = could not find container \"d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78\": container with ID starting with d2444ab599e9d890bca13a0bc6e3dc3eb9514b7b34ca87019322b376ec889a78 not found: ID does not exist" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.917559 4694 scope.go:117] "RemoveContainer" containerID="db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64" Feb 17 16:48:45 crc kubenswrapper[4694]: E0217 16:48:45.917948 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64\": container with ID starting with db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64 not found: ID does not exist" containerID="db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.917975 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64"} err="failed to get container status \"db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64\": rpc error: code = NotFound desc = could not find container \"db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64\": container with ID starting with db84364425c3945486f9019fe9ecd33e54c6c546292004052565a02de2844b64 not found: 
ID does not exist" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.918024 4694 scope.go:117] "RemoveContainer" containerID="4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.924046 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gp2g7"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.928120 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gp2g7"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.935855 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8q2l"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.936269 4694 scope.go:117] "RemoveContainer" containerID="fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.938589 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t8q2l"] Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.953819 4694 scope.go:117] "RemoveContainer" containerID="2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.968649 4694 scope.go:117] "RemoveContainer" containerID="4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4" Feb 17 16:48:45 crc kubenswrapper[4694]: E0217 16:48:45.969147 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4\": container with ID starting with 4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4 not found: ID does not exist" containerID="4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.969194 4694 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4"} err="failed to get container status \"4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4\": rpc error: code = NotFound desc = could not find container \"4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4\": container with ID starting with 4551c2c29e13eafee98ab3ab992f41617ffc64c7fe6d4c493cb599a3c732b8a4 not found: ID does not exist" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.969222 4694 scope.go:117] "RemoveContainer" containerID="fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913" Feb 17 16:48:45 crc kubenswrapper[4694]: E0217 16:48:45.969564 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913\": container with ID starting with fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913 not found: ID does not exist" containerID="fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.969601 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913"} err="failed to get container status \"fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913\": rpc error: code = NotFound desc = could not find container \"fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913\": container with ID starting with fcef4e2c427d413950ae213ae0d1b339220c630a4088b9c88c7e803e9bf30913 not found: ID does not exist" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.969649 4694 scope.go:117] "RemoveContainer" containerID="2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e" Feb 17 16:48:45 crc kubenswrapper[4694]: E0217 
16:48:45.969970 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e\": container with ID starting with 2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e not found: ID does not exist" containerID="2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.970025 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e"} err="failed to get container status \"2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e\": rpc error: code = NotFound desc = could not find container \"2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e\": container with ID starting with 2068b0b99052d9c49cd4fe013526941180c0a653ac13466d396568894530f30e not found: ID does not exist" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.970041 4694 scope.go:117] "RemoveContainer" containerID="ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607" Feb 17 16:48:45 crc kubenswrapper[4694]: I0217 16:48:45.981713 4694 scope.go:117] "RemoveContainer" containerID="0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.003429 4694 scope.go:117] "RemoveContainer" containerID="ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.004900 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607\": container with ID starting with ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607 not found: ID does not exist" 
containerID="ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.005400 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607"} err="failed to get container status \"ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607\": rpc error: code = NotFound desc = could not find container \"ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607\": container with ID starting with ad28f1b824da6d572ccaaa711206adfbe5d64fb974c7ba052936f546e28e9607 not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.005435 4694 scope.go:117] "RemoveContainer" containerID="0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.005880 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1\": container with ID starting with 0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1 not found: ID does not exist" containerID="0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.005922 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1"} err="failed to get container status \"0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1\": rpc error: code = NotFound desc = could not find container \"0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1\": container with ID starting with 0d7cd12a8c53119381cc018da977e938bdd2439a7ac053e8d7e4d358698bfef1 not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.005950 4694 scope.go:117] 
"RemoveContainer" containerID="244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.017758 4694 scope.go:117] "RemoveContainer" containerID="c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.033821 4694 scope.go:117] "RemoveContainer" containerID="2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.047788 4694 scope.go:117] "RemoveContainer" containerID="244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.048132 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5\": container with ID starting with 244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5 not found: ID does not exist" containerID="244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.048164 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5"} err="failed to get container status \"244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5\": rpc error: code = NotFound desc = could not find container \"244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5\": container with ID starting with 244428df6952ca85a10529c6e0ee649a964423ddc1f5c295b34acd838a25b6a5 not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.048193 4694 scope.go:117] "RemoveContainer" containerID="c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.048473 4694 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5\": container with ID starting with c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5 not found: ID does not exist" containerID="c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.048496 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5"} err="failed to get container status \"c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5\": rpc error: code = NotFound desc = could not find container \"c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5\": container with ID starting with c936187a631b7608ba43a884cd5ad009fdd0bf4bc3425f74f8169b20e446b0f5 not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.048510 4694 scope.go:117] "RemoveContainer" containerID="2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.048816 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c\": container with ID starting with 2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c not found: ID does not exist" containerID="2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.048845 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c"} err="failed to get container status \"2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c\": rpc error: code = NotFound desc = could not find container 
\"2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c\": container with ID starting with 2956dcb64977830c66de1d5a6cf9b6b81f1c059429a298c1e466b21815a4960c not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.048862 4694 scope.go:117] "RemoveContainer" containerID="8f31f6ffd8199d154ac7758f7aae96247573741a30c18f79c5ea2e749c6d21e6" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.070637 4694 scope.go:117] "RemoveContainer" containerID="f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.086260 4694 scope.go:117] "RemoveContainer" containerID="122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.099634 4694 scope.go:117] "RemoveContainer" containerID="a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.110244 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xsqk6"] Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.115410 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xsqk6"] Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.117432 4694 scope.go:117] "RemoveContainer" containerID="f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.117826 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9\": container with ID starting with f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9 not found: ID does not exist" containerID="f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.117867 4694 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9"} err="failed to get container status \"f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9\": rpc error: code = NotFound desc = could not find container \"f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9\": container with ID starting with f373bbdf9959f34d3d6f0519deb355598019b36e87b9d4eadb8dc8b67a5d6ce9 not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.117897 4694 scope.go:117] "RemoveContainer" containerID="122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.118453 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c\": container with ID starting with 122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c not found: ID does not exist" containerID="122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.118528 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c"} err="failed to get container status \"122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c\": rpc error: code = NotFound desc = could not find container \"122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c\": container with ID starting with 122d02a5f5453d3a5a7c8ed202d831cb2c2c36cff1b91874b6f8ed79884a5b7c not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.118561 4694 scope.go:117] "RemoveContainer" containerID="a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.119092 4694 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b\": container with ID starting with a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b not found: ID does not exist" containerID="a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.119116 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b"} err="failed to get container status \"a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b\": rpc error: code = NotFound desc = could not find container \"a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b\": container with ID starting with a2c660c2b6e26a8479f081f6a00e2cb58133e3a9999c9079790ebee583e8796b not found: ID does not exist" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.239072 4694 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t8q2l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.239131 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t8q2l" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.824502 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" 
event={"ID":"80752637-17b7-451f-a4f9-c15ff9d5bd47","Type":"ContainerStarted","Data":"7929a01ee644d0c4a35b181f87d1b6b3aa744fb3a4f78bc754cc89f9c804c99c"} Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.824558 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" event={"ID":"80752637-17b7-451f-a4f9-c15ff9d5bd47","Type":"ContainerStarted","Data":"23edf7effa9fc8ef047f2003af2c35430683b9389978dc477fb21cdce5090fc1"} Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.825178 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.829625 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.843332 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w82n9" podStartSLOduration=2.843314048 podStartE2EDuration="2.843314048s" podCreationTimestamp="2026-02-17 16:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:48:46.841437644 +0000 UTC m=+394.598513008" watchObservedRunningTime="2026-02-17 16:48:46.843314048 +0000 UTC m=+394.600389372" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.903410 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" path="/var/lib/kubelet/pods/0f68e586-955c-4c2c-8b3e-a91f6b95a442/volumes" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.904402 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" path="/var/lib/kubelet/pods/4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c/volumes" Feb 17 16:48:46 crc 
kubenswrapper[4694]: I0217 16:48:46.904948 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" path="/var/lib/kubelet/pods/7ff751c3-1af4-4a0d-b057-302ecb2d7bd4/volumes" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.905962 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" path="/var/lib/kubelet/pods/c7a9bea3-8150-4246-9c2b-dd9d57e17f30/volumes" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.906764 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" path="/var/lib/kubelet/pods/f2002375-3db0-44d4-8c8d-e945a20a38d9/volumes" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.942986 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5wb"] Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943176 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943186 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943196 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943202 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943210 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943216 4694 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943225 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943230 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943238 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943245 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943256 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943262 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943269 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943276 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943283 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943289 4694 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="extract-utilities" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943297 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943302 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943309 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943315 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943322 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943328 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943335 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943341 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943350 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943355 4694 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: E0217 16:48:46.943364 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943370 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="extract-content" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943457 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2002375-3db0-44d4-8c8d-e945a20a38d9" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943472 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f68e586-955c-4c2c-8b3e-a91f6b95a442" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943479 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a9bea3-8150-4246-9c2b-dd9d57e17f30" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943486 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943493 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfbb34b-e1d4-47f7-afbf-7cedf8805c4c" containerName="marketplace-operator" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.943504 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff751c3-1af4-4a0d-b057-302ecb2d7bd4" containerName="registry-server" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.944216 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.951328 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 16:48:46 crc kubenswrapper[4694]: I0217 16:48:46.953971 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5wb"] Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.082084 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64364749-c028-459b-8099-dd62cae9a8a1-catalog-content\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.082137 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7blp\" (UniqueName: \"kubernetes.io/projected/64364749-c028-459b-8099-dd62cae9a8a1-kube-api-access-l7blp\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.082428 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64364749-c028-459b-8099-dd62cae9a8a1-utilities\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.147857 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hxhhr"] Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.156472 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.158663 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.160346 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxhhr"] Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.183798 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64364749-c028-459b-8099-dd62cae9a8a1-catalog-content\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.183831 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7blp\" (UniqueName: \"kubernetes.io/projected/64364749-c028-459b-8099-dd62cae9a8a1-kube-api-access-l7blp\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.183876 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64364749-c028-459b-8099-dd62cae9a8a1-utilities\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.184255 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64364749-c028-459b-8099-dd62cae9a8a1-utilities\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 
17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.184458 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64364749-c028-459b-8099-dd62cae9a8a1-catalog-content\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.204245 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7blp\" (UniqueName: \"kubernetes.io/projected/64364749-c028-459b-8099-dd62cae9a8a1-kube-api-access-l7blp\") pod \"redhat-marketplace-8g5wb\" (UID: \"64364749-c028-459b-8099-dd62cae9a8a1\") " pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.265063 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.285000 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bc3455e-a064-46d4-9504-b6347f5508d5-utilities\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.285212 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9g6\" (UniqueName: \"kubernetes.io/projected/0bc3455e-a064-46d4-9504-b6347f5508d5-kube-api-access-zb9g6\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.285355 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0bc3455e-a064-46d4-9504-b6347f5508d5-catalog-content\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.386567 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9g6\" (UniqueName: \"kubernetes.io/projected/0bc3455e-a064-46d4-9504-b6347f5508d5-kube-api-access-zb9g6\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.387148 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bc3455e-a064-46d4-9504-b6347f5508d5-catalog-content\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.387271 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bc3455e-a064-46d4-9504-b6347f5508d5-utilities\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.388257 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bc3455e-a064-46d4-9504-b6347f5508d5-utilities\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.389516 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0bc3455e-a064-46d4-9504-b6347f5508d5-catalog-content\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.415712 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9g6\" (UniqueName: \"kubernetes.io/projected/0bc3455e-a064-46d4-9504-b6347f5508d5-kube-api-access-zb9g6\") pod \"redhat-operators-hxhhr\" (UID: \"0bc3455e-a064-46d4-9504-b6347f5508d5\") " pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.496418 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.712728 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8g5wb"] Feb 17 16:48:47 crc kubenswrapper[4694]: W0217 16:48:47.716632 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64364749_c028_459b_8099_dd62cae9a8a1.slice/crio-6012bbc7280e7d0fd8d0f9d93110e4d2b13bee9ddaf75a89868b84b2ae86f420 WatchSource:0}: Error finding container 6012bbc7280e7d0fd8d0f9d93110e4d2b13bee9ddaf75a89868b84b2ae86f420: Status 404 returned error can't find the container with id 6012bbc7280e7d0fd8d0f9d93110e4d2b13bee9ddaf75a89868b84b2ae86f420 Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.831229 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5wb" event={"ID":"64364749-c028-459b-8099-dd62cae9a8a1","Type":"ContainerStarted","Data":"6012bbc7280e7d0fd8d0f9d93110e4d2b13bee9ddaf75a89868b84b2ae86f420"} Feb 17 16:48:47 crc kubenswrapper[4694]: I0217 16:48:47.902659 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-hxhhr"] Feb 17 16:48:47 crc kubenswrapper[4694]: W0217 16:48:47.936318 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc3455e_a064_46d4_9504_b6347f5508d5.slice/crio-3151efe2672727cc2b4d743d5e60e447a71a25b3ec55ff64dc6a5165e0e8f263 WatchSource:0}: Error finding container 3151efe2672727cc2b4d743d5e60e447a71a25b3ec55ff64dc6a5165e0e8f263: Status 404 returned error can't find the container with id 3151efe2672727cc2b4d743d5e60e447a71a25b3ec55ff64dc6a5165e0e8f263 Feb 17 16:48:48 crc kubenswrapper[4694]: I0217 16:48:48.838733 4694 generic.go:334] "Generic (PLEG): container finished" podID="64364749-c028-459b-8099-dd62cae9a8a1" containerID="dd6f3029049369a41babb8f7be03840d346929e6b91a1eb22e572595a041a6e2" exitCode=0 Feb 17 16:48:48 crc kubenswrapper[4694]: I0217 16:48:48.838847 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5wb" event={"ID":"64364749-c028-459b-8099-dd62cae9a8a1","Type":"ContainerDied","Data":"dd6f3029049369a41babb8f7be03840d346929e6b91a1eb22e572595a041a6e2"} Feb 17 16:48:48 crc kubenswrapper[4694]: I0217 16:48:48.840783 4694 generic.go:334] "Generic (PLEG): container finished" podID="0bc3455e-a064-46d4-9504-b6347f5508d5" containerID="659fddb84f3c14ef34910048e292669535ec6e8c2bc5f850d4ff87161caf79dd" exitCode=0 Feb 17 16:48:48 crc kubenswrapper[4694]: I0217 16:48:48.840860 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxhhr" event={"ID":"0bc3455e-a064-46d4-9504-b6347f5508d5","Type":"ContainerDied","Data":"659fddb84f3c14ef34910048e292669535ec6e8c2bc5f850d4ff87161caf79dd"} Feb 17 16:48:48 crc kubenswrapper[4694]: I0217 16:48:48.840893 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxhhr" 
event={"ID":"0bc3455e-a064-46d4-9504-b6347f5508d5","Type":"ContainerStarted","Data":"3151efe2672727cc2b4d743d5e60e447a71a25b3ec55ff64dc6a5165e0e8f263"} Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.350129 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hznv8"] Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.352026 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.353981 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.359215 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hznv8"] Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.513348 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737fdc2f-4b41-4c22-bae3-2411c82d16af-catalog-content\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.513837 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737fdc2f-4b41-4c22-bae3-2411c82d16af-utilities\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.513914 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhgqd\" (UniqueName: \"kubernetes.io/projected/737fdc2f-4b41-4c22-bae3-2411c82d16af-kube-api-access-dhgqd\") pod 
\"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.556001 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pb62z"] Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.558333 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.560701 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.561089 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb62z"] Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.615388 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737fdc2f-4b41-4c22-bae3-2411c82d16af-utilities\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.615451 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhgqd\" (UniqueName: \"kubernetes.io/projected/737fdc2f-4b41-4c22-bae3-2411c82d16af-kube-api-access-dhgqd\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.615543 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737fdc2f-4b41-4c22-bae3-2411c82d16af-catalog-content\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " 
pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.615976 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737fdc2f-4b41-4c22-bae3-2411c82d16af-utilities\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.615993 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737fdc2f-4b41-4c22-bae3-2411c82d16af-catalog-content\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.638806 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhgqd\" (UniqueName: \"kubernetes.io/projected/737fdc2f-4b41-4c22-bae3-2411c82d16af-kube-api-access-dhgqd\") pod \"certified-operators-hznv8\" (UID: \"737fdc2f-4b41-4c22-bae3-2411c82d16af\") " pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.678146 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.716302 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f462\" (UniqueName: \"kubernetes.io/projected/35ca4ecd-4b3f-45f9-b620-a945b332d711-kube-api-access-4f462\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.716369 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca4ecd-4b3f-45f9-b620-a945b332d711-catalog-content\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.716450 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca4ecd-4b3f-45f9-b620-a945b332d711-utilities\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.818723 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f462\" (UniqueName: \"kubernetes.io/projected/35ca4ecd-4b3f-45f9-b620-a945b332d711-kube-api-access-4f462\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.819204 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca4ecd-4b3f-45f9-b620-a945b332d711-catalog-content\") pod 
\"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.819310 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca4ecd-4b3f-45f9-b620-a945b332d711-utilities\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.820152 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35ca4ecd-4b3f-45f9-b620-a945b332d711-catalog-content\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.820289 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35ca4ecd-4b3f-45f9-b620-a945b332d711-utilities\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.836869 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f462\" (UniqueName: \"kubernetes.io/projected/35ca4ecd-4b3f-45f9-b620-a945b332d711-kube-api-access-4f462\") pod \"community-operators-pb62z\" (UID: \"35ca4ecd-4b3f-45f9-b620-a945b332d711\") " pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.859387 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxhhr" 
event={"ID":"0bc3455e-a064-46d4-9504-b6347f5508d5","Type":"ContainerStarted","Data":"465b0759020ae59797f3ac4e7b14ef0f6f37975fbe792423c5995387e2992c45"} Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.862996 4694 generic.go:334] "Generic (PLEG): container finished" podID="64364749-c028-459b-8099-dd62cae9a8a1" containerID="f4faf8421a37fe4ecc9c788875eb6f4fcdcf24c7d536abf2da8a680568249580" exitCode=0 Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.863044 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5wb" event={"ID":"64364749-c028-459b-8099-dd62cae9a8a1","Type":"ContainerDied","Data":"f4faf8421a37fe4ecc9c788875eb6f4fcdcf24c7d536abf2da8a680568249580"} Feb 17 16:48:49 crc kubenswrapper[4694]: I0217 16:48:49.880173 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.120230 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hznv8"] Feb 17 16:48:50 crc kubenswrapper[4694]: W0217 16:48:50.127263 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737fdc2f_4b41_4c22_bae3_2411c82d16af.slice/crio-fcb037ff7e0d8cfa4432f58c41e92d9672fa44efaae70ab7b9684a34f8412cbb WatchSource:0}: Error finding container fcb037ff7e0d8cfa4432f58c41e92d9672fa44efaae70ab7b9684a34f8412cbb: Status 404 returned error can't find the container with id fcb037ff7e0d8cfa4432f58c41e92d9672fa44efaae70ab7b9684a34f8412cbb Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.286166 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pb62z"] Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.875751 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8g5wb" 
event={"ID":"64364749-c028-459b-8099-dd62cae9a8a1","Type":"ContainerStarted","Data":"28b2f58def0e7ba8189cad37b8b4c887d21ce715204c613099f4d3782d5109fa"} Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.878380 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxhhr" event={"ID":"0bc3455e-a064-46d4-9504-b6347f5508d5","Type":"ContainerDied","Data":"465b0759020ae59797f3ac4e7b14ef0f6f37975fbe792423c5995387e2992c45"} Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.878694 4694 generic.go:334] "Generic (PLEG): container finished" podID="0bc3455e-a064-46d4-9504-b6347f5508d5" containerID="465b0759020ae59797f3ac4e7b14ef0f6f37975fbe792423c5995387e2992c45" exitCode=0 Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.885020 4694 generic.go:334] "Generic (PLEG): container finished" podID="737fdc2f-4b41-4c22-bae3-2411c82d16af" containerID="7a9d8287034b50a65e0db8b16d306abf65cf1307883dedd084c8959f71df4fe2" exitCode=0 Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.885095 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hznv8" event={"ID":"737fdc2f-4b41-4c22-bae3-2411c82d16af","Type":"ContainerDied","Data":"7a9d8287034b50a65e0db8b16d306abf65cf1307883dedd084c8959f71df4fe2"} Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.885121 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hznv8" event={"ID":"737fdc2f-4b41-4c22-bae3-2411c82d16af","Type":"ContainerStarted","Data":"fcb037ff7e0d8cfa4432f58c41e92d9672fa44efaae70ab7b9684a34f8412cbb"} Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.888642 4694 generic.go:334] "Generic (PLEG): container finished" podID="35ca4ecd-4b3f-45f9-b620-a945b332d711" containerID="967a89fb9461f8b289f1c7367905c54ebd73a9beeeffc000be06a06398efbdf6" exitCode=0 Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.888693 4694 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-pb62z" event={"ID":"35ca4ecd-4b3f-45f9-b620-a945b332d711","Type":"ContainerDied","Data":"967a89fb9461f8b289f1c7367905c54ebd73a9beeeffc000be06a06398efbdf6"} Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.888725 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb62z" event={"ID":"35ca4ecd-4b3f-45f9-b620-a945b332d711","Type":"ContainerStarted","Data":"0535e3750941808c3716f827779903a5637fe2795abd778eacbd4aff15d4e4eb"} Feb 17 16:48:50 crc kubenswrapper[4694]: I0217 16:48:50.903976 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8g5wb" podStartSLOduration=3.412868873 podStartE2EDuration="4.90395817s" podCreationTimestamp="2026-02-17 16:48:46 +0000 UTC" firstStartedPulling="2026-02-17 16:48:48.840228694 +0000 UTC m=+396.597304018" lastFinishedPulling="2026-02-17 16:48:50.331317991 +0000 UTC m=+398.088393315" observedRunningTime="2026-02-17 16:48:50.900521751 +0000 UTC m=+398.657597075" watchObservedRunningTime="2026-02-17 16:48:50.90395817 +0000 UTC m=+398.661033494" Feb 17 16:48:51 crc kubenswrapper[4694]: I0217 16:48:51.895376 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxhhr" event={"ID":"0bc3455e-a064-46d4-9504-b6347f5508d5","Type":"ContainerStarted","Data":"2002566e34e70cfeb18eab92851df9bbdbd1a7d38e07d91f731f02f53b4010f3"} Feb 17 16:48:51 crc kubenswrapper[4694]: I0217 16:48:51.897331 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hznv8" event={"ID":"737fdc2f-4b41-4c22-bae3-2411c82d16af","Type":"ContainerStarted","Data":"49fc89fb8f4dbfdb9121a49b77e84dcf809eb0a2c3c190d8790fe39b3d9de195"} Feb 17 16:48:51 crc kubenswrapper[4694]: I0217 16:48:51.899160 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb62z" 
event={"ID":"35ca4ecd-4b3f-45f9-b620-a945b332d711","Type":"ContainerStarted","Data":"d883ce3123b3982754bf2b6a1a21789e66936a11c5442c7e5a8af7317256582a"} Feb 17 16:48:51 crc kubenswrapper[4694]: I0217 16:48:51.914821 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hxhhr" podStartSLOduration=2.447262217 podStartE2EDuration="4.914803245s" podCreationTimestamp="2026-02-17 16:48:47 +0000 UTC" firstStartedPulling="2026-02-17 16:48:48.842349443 +0000 UTC m=+396.599424767" lastFinishedPulling="2026-02-17 16:48:51.309890471 +0000 UTC m=+399.066965795" observedRunningTime="2026-02-17 16:48:51.911847177 +0000 UTC m=+399.668922501" watchObservedRunningTime="2026-02-17 16:48:51.914803245 +0000 UTC m=+399.671878569" Feb 17 16:48:52 crc kubenswrapper[4694]: I0217 16:48:52.924748 4694 generic.go:334] "Generic (PLEG): container finished" podID="737fdc2f-4b41-4c22-bae3-2411c82d16af" containerID="49fc89fb8f4dbfdb9121a49b77e84dcf809eb0a2c3c190d8790fe39b3d9de195" exitCode=0 Feb 17 16:48:52 crc kubenswrapper[4694]: I0217 16:48:52.924871 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hznv8" event={"ID":"737fdc2f-4b41-4c22-bae3-2411c82d16af","Type":"ContainerDied","Data":"49fc89fb8f4dbfdb9121a49b77e84dcf809eb0a2c3c190d8790fe39b3d9de195"} Feb 17 16:48:52 crc kubenswrapper[4694]: I0217 16:48:52.930934 4694 generic.go:334] "Generic (PLEG): container finished" podID="35ca4ecd-4b3f-45f9-b620-a945b332d711" containerID="d883ce3123b3982754bf2b6a1a21789e66936a11c5442c7e5a8af7317256582a" exitCode=0 Feb 17 16:48:52 crc kubenswrapper[4694]: I0217 16:48:52.931004 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb62z" event={"ID":"35ca4ecd-4b3f-45f9-b620-a945b332d711","Type":"ContainerDied","Data":"d883ce3123b3982754bf2b6a1a21789e66936a11c5442c7e5a8af7317256582a"} Feb 17 16:48:53 crc kubenswrapper[4694]: I0217 16:48:53.937832 
4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pb62z" event={"ID":"35ca4ecd-4b3f-45f9-b620-a945b332d711","Type":"ContainerStarted","Data":"af6f3dc29c3e9531ced7fb1068b0d2451a67188dfb48406a773c2e557ff42e89"} Feb 17 16:48:53 crc kubenswrapper[4694]: I0217 16:48:53.941904 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hznv8" event={"ID":"737fdc2f-4b41-4c22-bae3-2411c82d16af","Type":"ContainerStarted","Data":"0e1c5c4eec6c072c06beff1f5eec7a8a39e37383715ff2bd4e24a8c64cc6165a"} Feb 17 16:48:53 crc kubenswrapper[4694]: I0217 16:48:53.961626 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pb62z" podStartSLOduration=2.385870821 podStartE2EDuration="4.961594751s" podCreationTimestamp="2026-02-17 16:48:49 +0000 UTC" firstStartedPulling="2026-02-17 16:48:50.892947847 +0000 UTC m=+398.650023171" lastFinishedPulling="2026-02-17 16:48:53.468671777 +0000 UTC m=+401.225747101" observedRunningTime="2026-02-17 16:48:53.95893999 +0000 UTC m=+401.716015314" watchObservedRunningTime="2026-02-17 16:48:53.961594751 +0000 UTC m=+401.718670075" Feb 17 16:48:53 crc kubenswrapper[4694]: I0217 16:48:53.982842 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hznv8" podStartSLOduration=2.346970314 podStartE2EDuration="4.98282618s" podCreationTimestamp="2026-02-17 16:48:49 +0000 UTC" firstStartedPulling="2026-02-17 16:48:50.89353138 +0000 UTC m=+398.650606704" lastFinishedPulling="2026-02-17 16:48:53.529387246 +0000 UTC m=+401.286462570" observedRunningTime="2026-02-17 16:48:53.981189203 +0000 UTC m=+401.738264527" watchObservedRunningTime="2026-02-17 16:48:53.98282618 +0000 UTC m=+401.739901504" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.265318 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.269602 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.325528 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.438881 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" podUID="a3388f49-0f84-40f2-8030-a6f508979e71" containerName="registry" containerID="cri-o://220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d" gracePeriod=30 Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.497597 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.497670 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.543954 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.879879 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937050 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtlf\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-kube-api-access-lhtlf\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937259 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937289 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-registry-certificates\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937329 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3388f49-0f84-40f2-8030-a6f508979e71-installation-pull-secrets\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937357 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-registry-tls\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937385 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-trusted-ca\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937433 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-bound-sa-token\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.937469 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3388f49-0f84-40f2-8030-a6f508979e71-ca-trust-extracted\") pod \"a3388f49-0f84-40f2-8030-a6f508979e71\" (UID: \"a3388f49-0f84-40f2-8030-a6f508979e71\") " Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.938088 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.938276 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.943165 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.943676 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3388f49-0f84-40f2-8030-a6f508979e71-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.943775 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-kube-api-access-lhtlf" (OuterVolumeSpecName: "kube-api-access-lhtlf") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "kube-api-access-lhtlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.944094 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.948573 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.953893 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3388f49-0f84-40f2-8030-a6f508979e71-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a3388f49-0f84-40f2-8030-a6f508979e71" (UID: "a3388f49-0f84-40f2-8030-a6f508979e71"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.965546 4694 generic.go:334] "Generic (PLEG): container finished" podID="a3388f49-0f84-40f2-8030-a6f508979e71" containerID="220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d" exitCode=0 Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.965871 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" event={"ID":"a3388f49-0f84-40f2-8030-a6f508979e71","Type":"ContainerDied","Data":"220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d"} Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.965907 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" event={"ID":"a3388f49-0f84-40f2-8030-a6f508979e71","Type":"ContainerDied","Data":"f2e9d6ec322f83cee67142578ace1145088e9adcb69d57566c632f7ea2fe38af"} Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.965926 4694 scope.go:117] 
"RemoveContainer" containerID="220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d" Feb 17 16:48:57 crc kubenswrapper[4694]: I0217 16:48:57.965965 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6d7zh" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.002954 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6d7zh"] Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.003758 4694 scope.go:117] "RemoveContainer" containerID="220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.004376 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6d7zh"] Feb 17 16:48:58 crc kubenswrapper[4694]: E0217 16:48:58.004420 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d\": container with ID starting with 220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d not found: ID does not exist" containerID="220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.004458 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d"} err="failed to get container status \"220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d\": rpc error: code = NotFound desc = could not find container \"220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d\": container with ID starting with 220184727cd67debf1c83c738179ff6759dde92a72c4291be2e5f7c6e918264d not found: ID does not exist" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.010120 4694 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hxhhr" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.010744 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8g5wb" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.038396 4694 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.039129 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.039152 4694 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.039162 4694 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a3388f49-0f84-40f2-8030-a6f508979e71-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.039172 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhtlf\" (UniqueName: \"kubernetes.io/projected/a3388f49-0f84-40f2-8030-a6f508979e71-kube-api-access-lhtlf\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.039180 4694 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a3388f49-0f84-40f2-8030-a6f508979e71-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.039189 4694 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a3388f49-0f84-40f2-8030-a6f508979e71-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 16:48:58 crc kubenswrapper[4694]: I0217 16:48:58.905082 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3388f49-0f84-40f2-8030-a6f508979e71" path="/var/lib/kubelet/pods/a3388f49-0f84-40f2-8030-a6f508979e71/volumes" Feb 17 16:48:59 crc kubenswrapper[4694]: I0217 16:48:59.678819 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:59 crc kubenswrapper[4694]: I0217 16:48:59.678890 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:59 crc kubenswrapper[4694]: I0217 16:48:59.728046 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:48:59 crc kubenswrapper[4694]: I0217 16:48:59.880998 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:59 crc kubenswrapper[4694]: I0217 16:48:59.881286 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:48:59 crc kubenswrapper[4694]: I0217 16:48:59.936847 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pb62z" Feb 17 16:49:00 crc kubenswrapper[4694]: I0217 16:49:00.024080 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hznv8" Feb 17 16:49:00 crc kubenswrapper[4694]: I0217 16:49:00.034806 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pb62z" Feb 17 
16:51:14 crc kubenswrapper[4694]: I0217 16:51:14.618127 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:51:14 crc kubenswrapper[4694]: I0217 16:51:14.618820 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:51:44 crc kubenswrapper[4694]: I0217 16:51:44.618425 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:51:44 crc kubenswrapper[4694]: I0217 16:51:44.619262 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:52:14 crc kubenswrapper[4694]: I0217 16:52:14.617770 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:52:14 crc kubenswrapper[4694]: I0217 16:52:14.618289 4694 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:52:14 crc kubenswrapper[4694]: I0217 16:52:14.618328 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:52:14 crc kubenswrapper[4694]: I0217 16:52:14.618925 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96aab37567d8c8b776aebabf47ba93d557454ae172096341f6f115f7ff5ec595"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 16:52:14 crc kubenswrapper[4694]: I0217 16:52:14.618982 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://96aab37567d8c8b776aebabf47ba93d557454ae172096341f6f115f7ff5ec595" gracePeriod=600 Feb 17 16:52:15 crc kubenswrapper[4694]: I0217 16:52:15.119916 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="96aab37567d8c8b776aebabf47ba93d557454ae172096341f6f115f7ff5ec595" exitCode=0 Feb 17 16:52:15 crc kubenswrapper[4694]: I0217 16:52:15.119979 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"96aab37567d8c8b776aebabf47ba93d557454ae172096341f6f115f7ff5ec595"} Feb 17 16:52:15 crc kubenswrapper[4694]: I0217 16:52:15.120316 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"47802caf4da2d01def887b2a300cd0debb1b3b1e63a218a62fa742a467a1bdb3"} Feb 17 16:52:15 crc kubenswrapper[4694]: I0217 16:52:15.120349 4694 scope.go:117] "RemoveContainer" containerID="963ecf435fb681d4097c1e2e11de629281374ce880fdb6edbb191e877f7901e8" Feb 17 16:54:14 crc kubenswrapper[4694]: I0217 16:54:14.618076 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:54:14 crc kubenswrapper[4694]: I0217 16:54:14.618838 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:54:44 crc kubenswrapper[4694]: I0217 16:54:44.618417 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:54:44 crc kubenswrapper[4694]: I0217 16:54:44.619313 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:54:46 crc kubenswrapper[4694]: I0217 16:54:46.883259 4694 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 16:55:14 crc kubenswrapper[4694]: I0217 16:55:14.618176 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:55:14 crc kubenswrapper[4694]: I0217 16:55:14.618864 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:55:14 crc kubenswrapper[4694]: I0217 16:55:14.618941 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:55:14 crc kubenswrapper[4694]: I0217 16:55:14.619864 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47802caf4da2d01def887b2a300cd0debb1b3b1e63a218a62fa742a467a1bdb3"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 16:55:14 crc kubenswrapper[4694]: I0217 16:55:14.619989 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://47802caf4da2d01def887b2a300cd0debb1b3b1e63a218a62fa742a467a1bdb3" gracePeriod=600 Feb 17 16:55:15 crc kubenswrapper[4694]: I0217 16:55:15.178555 4694 generic.go:334] "Generic 
(PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="47802caf4da2d01def887b2a300cd0debb1b3b1e63a218a62fa742a467a1bdb3" exitCode=0 Feb 17 16:55:15 crc kubenswrapper[4694]: I0217 16:55:15.178665 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"47802caf4da2d01def887b2a300cd0debb1b3b1e63a218a62fa742a467a1bdb3"} Feb 17 16:55:15 crc kubenswrapper[4694]: I0217 16:55:15.179156 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"d4749332bdc4a5e5d10099fdef7b4d20f81424c0b600631d13aa0f1be1b09107"} Feb 17 16:55:15 crc kubenswrapper[4694]: I0217 16:55:15.179184 4694 scope.go:117] "RemoveContainer" containerID="96aab37567d8c8b776aebabf47ba93d557454ae172096341f6f115f7ff5ec595" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.887863 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-45nhc"] Feb 17 16:55:31 crc kubenswrapper[4694]: E0217 16:55:31.888552 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3388f49-0f84-40f2-8030-a6f508979e71" containerName="registry" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.888566 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3388f49-0f84-40f2-8030-a6f508979e71" containerName="registry" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.888702 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3388f49-0f84-40f2-8030-a6f508979e71" containerName="registry" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.889118 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.891999 4694 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hg49b" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.892178 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.892279 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.899953 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-k7pjr"] Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.903850 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-k7pjr" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.905939 4694 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bnfcn" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.916151 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-k7pjr"] Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.932594 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c29mn"] Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.933261 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.937919 4694 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6hdxr" Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.955664 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c29mn"] Feb 17 16:55:31 crc kubenswrapper[4694]: I0217 16:55:31.961419 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-45nhc"] Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.034422 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwlh\" (UniqueName: \"kubernetes.io/projected/da727533-6298-432b-9048-f78ff8faad8f-kube-api-access-ptwlh\") pod \"cert-manager-webhook-687f57d79b-c29mn\" (UID: \"da727533-6298-432b-9048-f78ff8faad8f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.034496 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfzzk\" (UniqueName: \"kubernetes.io/projected/56fc387f-0f40-4de1-b4f5-628ecdecc25b-kube-api-access-lfzzk\") pod \"cert-manager-cainjector-cf98fcc89-45nhc\" (UID: \"56fc387f-0f40-4de1-b4f5-628ecdecc25b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.034546 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64zt2\" (UniqueName: \"kubernetes.io/projected/a4ed5c6f-777c-4e48-acc5-335e03efbe15-kube-api-access-64zt2\") pod \"cert-manager-858654f9db-k7pjr\" (UID: \"a4ed5c6f-777c-4e48-acc5-335e03efbe15\") " pod="cert-manager/cert-manager-858654f9db-k7pjr" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.135715 
4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64zt2\" (UniqueName: \"kubernetes.io/projected/a4ed5c6f-777c-4e48-acc5-335e03efbe15-kube-api-access-64zt2\") pod \"cert-manager-858654f9db-k7pjr\" (UID: \"a4ed5c6f-777c-4e48-acc5-335e03efbe15\") " pod="cert-manager/cert-manager-858654f9db-k7pjr" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.135872 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwlh\" (UniqueName: \"kubernetes.io/projected/da727533-6298-432b-9048-f78ff8faad8f-kube-api-access-ptwlh\") pod \"cert-manager-webhook-687f57d79b-c29mn\" (UID: \"da727533-6298-432b-9048-f78ff8faad8f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.135955 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfzzk\" (UniqueName: \"kubernetes.io/projected/56fc387f-0f40-4de1-b4f5-628ecdecc25b-kube-api-access-lfzzk\") pod \"cert-manager-cainjector-cf98fcc89-45nhc\" (UID: \"56fc387f-0f40-4de1-b4f5-628ecdecc25b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.154545 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64zt2\" (UniqueName: \"kubernetes.io/projected/a4ed5c6f-777c-4e48-acc5-335e03efbe15-kube-api-access-64zt2\") pod \"cert-manager-858654f9db-k7pjr\" (UID: \"a4ed5c6f-777c-4e48-acc5-335e03efbe15\") " pod="cert-manager/cert-manager-858654f9db-k7pjr" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.160769 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwlh\" (UniqueName: \"kubernetes.io/projected/da727533-6298-432b-9048-f78ff8faad8f-kube-api-access-ptwlh\") pod \"cert-manager-webhook-687f57d79b-c29mn\" (UID: \"da727533-6298-432b-9048-f78ff8faad8f\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.163163 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfzzk\" (UniqueName: \"kubernetes.io/projected/56fc387f-0f40-4de1-b4f5-628ecdecc25b-kube-api-access-lfzzk\") pod \"cert-manager-cainjector-cf98fcc89-45nhc\" (UID: \"56fc387f-0f40-4de1-b4f5-628ecdecc25b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.215838 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.239657 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-k7pjr" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.248874 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.474915 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-k7pjr"] Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.482657 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.653866 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-45nhc"] Feb 17 16:55:32 crc kubenswrapper[4694]: I0217 16:55:32.728657 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-c29mn"] Feb 17 16:55:32 crc kubenswrapper[4694]: W0217 16:55:32.730002 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda727533_6298_432b_9048_f78ff8faad8f.slice/crio-8b3bb624e1f14cd2bcffc3d60c502190e425ac3f1f48fec0ac63775a44b59747 WatchSource:0}: Error finding container 8b3bb624e1f14cd2bcffc3d60c502190e425ac3f1f48fec0ac63775a44b59747: Status 404 returned error can't find the container with id 8b3bb624e1f14cd2bcffc3d60c502190e425ac3f1f48fec0ac63775a44b59747 Feb 17 16:55:33 crc kubenswrapper[4694]: I0217 16:55:33.307366 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" event={"ID":"da727533-6298-432b-9048-f78ff8faad8f","Type":"ContainerStarted","Data":"8b3bb624e1f14cd2bcffc3d60c502190e425ac3f1f48fec0ac63775a44b59747"} Feb 17 16:55:33 crc kubenswrapper[4694]: I0217 16:55:33.309513 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" event={"ID":"56fc387f-0f40-4de1-b4f5-628ecdecc25b","Type":"ContainerStarted","Data":"1c4142bbfd657a262155b35f39a2681a1a9e4ce0ac5ee249a3d956f24ddd1a25"} Feb 17 16:55:33 crc kubenswrapper[4694]: I0217 16:55:33.310571 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-k7pjr" event={"ID":"a4ed5c6f-777c-4e48-acc5-335e03efbe15","Type":"ContainerStarted","Data":"419fb3d726e1ce3672fad417856059180aa414a7e60551809164b8ba0c45d1bc"} Feb 17 16:55:37 crc kubenswrapper[4694]: I0217 16:55:37.336164 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" event={"ID":"56fc387f-0f40-4de1-b4f5-628ecdecc25b","Type":"ContainerStarted","Data":"817c2d39951bd2437809c867fafcccb4690f9f1f2a08c05fb69c22cf49f4a673"} Feb 17 16:55:37 crc kubenswrapper[4694]: I0217 16:55:37.338737 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-k7pjr" 
event={"ID":"a4ed5c6f-777c-4e48-acc5-335e03efbe15","Type":"ContainerStarted","Data":"6cb45cd9fbeed254c897d25d8f69930a77494bb879dc98ddbbe1616007a24b48"} Feb 17 16:55:37 crc kubenswrapper[4694]: I0217 16:55:37.340333 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" event={"ID":"da727533-6298-432b-9048-f78ff8faad8f","Type":"ContainerStarted","Data":"f6565002aff24937a8e43954dd458fafde28cb3dad286a6ffdd1c729d3d82feb"} Feb 17 16:55:37 crc kubenswrapper[4694]: I0217 16:55:37.340552 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:37 crc kubenswrapper[4694]: I0217 16:55:37.350886 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-45nhc" podStartSLOduration=2.10321021 podStartE2EDuration="6.350865615s" podCreationTimestamp="2026-02-17 16:55:31 +0000 UTC" firstStartedPulling="2026-02-17 16:55:32.660729817 +0000 UTC m=+800.417805161" lastFinishedPulling="2026-02-17 16:55:36.908385252 +0000 UTC m=+804.665460566" observedRunningTime="2026-02-17 16:55:37.347305058 +0000 UTC m=+805.104380402" watchObservedRunningTime="2026-02-17 16:55:37.350865615 +0000 UTC m=+805.107940939" Feb 17 16:55:37 crc kubenswrapper[4694]: I0217 16:55:37.375194 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-k7pjr" podStartSLOduration=2.1524936 podStartE2EDuration="6.375161232s" podCreationTimestamp="2026-02-17 16:55:31 +0000 UTC" firstStartedPulling="2026-02-17 16:55:32.482395828 +0000 UTC m=+800.239471152" lastFinishedPulling="2026-02-17 16:55:36.70506347 +0000 UTC m=+804.462138784" observedRunningTime="2026-02-17 16:55:37.366452148 +0000 UTC m=+805.123527472" watchObservedRunningTime="2026-02-17 16:55:37.375161232 +0000 UTC m=+805.132236606" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.359897 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" podStartSLOduration=4.248467239 podStartE2EDuration="8.359880159s" podCreationTimestamp="2026-02-17 16:55:31 +0000 UTC" firstStartedPulling="2026-02-17 16:55:32.731885454 +0000 UTC m=+800.488960788" lastFinishedPulling="2026-02-17 16:55:36.843298384 +0000 UTC m=+804.600373708" observedRunningTime="2026-02-17 16:55:37.403552449 +0000 UTC m=+805.160627813" watchObservedRunningTime="2026-02-17 16:55:39.359880159 +0000 UTC m=+807.116955493" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.367653 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vq5rk"] Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.369291 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.433804 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vq5rk"] Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.535875 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-catalog-content\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.536021 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-utilities\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.536069 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wkfc\" (UniqueName: \"kubernetes.io/projected/be187bfc-0eab-4648-95da-0d2e0100b6cb-kube-api-access-6wkfc\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.637186 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-catalog-content\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.637246 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-utilities\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.637265 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wkfc\" (UniqueName: \"kubernetes.io/projected/be187bfc-0eab-4648-95da-0d2e0100b6cb-kube-api-access-6wkfc\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.637717 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-catalog-content\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.637810 4694 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-utilities\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.657398 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wkfc\" (UniqueName: \"kubernetes.io/projected/be187bfc-0eab-4648-95da-0d2e0100b6cb-kube-api-access-6wkfc\") pod \"redhat-operators-vq5rk\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:39 crc kubenswrapper[4694]: I0217 16:55:39.690133 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:40 crc kubenswrapper[4694]: I0217 16:55:40.109489 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vq5rk"] Feb 17 16:55:40 crc kubenswrapper[4694]: W0217 16:55:40.117089 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe187bfc_0eab_4648_95da_0d2e0100b6cb.slice/crio-009a1318d639947860c36be0a724915a6cd09dac0dff226490c478ab1c9b1a91 WatchSource:0}: Error finding container 009a1318d639947860c36be0a724915a6cd09dac0dff226490c478ab1c9b1a91: Status 404 returned error can't find the container with id 009a1318d639947860c36be0a724915a6cd09dac0dff226490c478ab1c9b1a91 Feb 17 16:55:40 crc kubenswrapper[4694]: I0217 16:55:40.366280 4694 generic.go:334] "Generic (PLEG): container finished" podID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerID="af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712" exitCode=0 Feb 17 16:55:40 crc kubenswrapper[4694]: I0217 16:55:40.366316 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" 
event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerDied","Data":"af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712"} Feb 17 16:55:40 crc kubenswrapper[4694]: I0217 16:55:40.366357 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerStarted","Data":"009a1318d639947860c36be0a724915a6cd09dac0dff226490c478ab1c9b1a91"} Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.374045 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerStarted","Data":"85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03"} Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.715791 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8fjpm"] Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716211 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-controller" containerID="cri-o://3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716261 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="sbdb" containerID="cri-o://ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716339 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="nbdb" 
containerID="cri-o://46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716382 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="northd" containerID="cri-o://02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716419 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716457 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-node" containerID="cri-o://2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.716493 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-acl-logging" containerID="cri-o://12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" gracePeriod=30 Feb 17 16:55:41 crc kubenswrapper[4694]: I0217 16:55:41.756317 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" containerID="cri-o://75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" gracePeriod=30 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 
16:55:42.098398 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/3.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.100320 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovn-acl-logging/0.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.100774 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovn-controller/0.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.101345 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.181671 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqvfj"] Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.181950 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kubecfg-setup" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.181966 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kubecfg-setup" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.181978 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="nbdb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.181986 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="nbdb" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.181996 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="sbdb" Feb 17 16:55:42 crc 
kubenswrapper[4694]: I0217 16:55:42.182004 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="sbdb" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182018 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="northd" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182026 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="northd" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182037 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182046 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182093 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182102 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182112 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182120 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182130 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 
16:55:42.182137 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182174 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182182 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182194 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-node" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182202 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-node" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182212 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-acl-logging" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182219 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-acl-logging" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.182256 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182265 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182452 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="nbdb" Feb 17 16:55:42 crc 
kubenswrapper[4694]: I0217 16:55:42.182468 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-node" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182505 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182515 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182525 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182534 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182544 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="northd" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182577 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182591 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="sbdb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.182647 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovn-acl-logging" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.184326 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" 
containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.184378 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.185443 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.185483 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerName="ovnkube-controller" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.195122 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.253301 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-c29mn" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.296936 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-node-log\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.296992 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-openvswitch\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297021 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-bin\") pod 
\"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297046 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-ovn-kubernetes\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297124 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297173 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-node-log" (OuterVolumeSpecName: "node-log") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297203 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297225 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297335 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpm9k\" (UniqueName: \"kubernetes.io/projected/d15f1d18-d80a-4fc0-a710-a95c74465b6e-kube-api-access-hpm9k\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297798 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovn-node-metrics-cert\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297834 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-ovn\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297855 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc 
kubenswrapper[4694]: I0217 16:55:42.297875 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-netns\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297909 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-script-lib\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297936 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-kubelet\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297954 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-env-overrides\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297973 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-log-socket\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.297988 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-netd\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298011 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-systemd\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298028 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-systemd-units\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298048 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-config\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298071 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-etc-openvswitch\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298091 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-slash\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298112 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-var-lib-openvswitch\") pod \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\" (UID: \"d15f1d18-d80a-4fc0-a710-a95c74465b6e\") " Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298208 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-systemd\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298232 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-slash\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298252 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-node-log\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298272 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-run-netns\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298290 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-cni-netd\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298314 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkq5c\" (UniqueName: \"kubernetes.io/projected/4a088de9-a5d7-48d7-ac21-039f1dbc6358-kube-api-access-lkq5c\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298339 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-env-overrides\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298373 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-cni-bin\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298397 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 
16:55:42.298417 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovnkube-script-lib\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298440 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298458 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-log-socket\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298480 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-etc-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298501 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-var-lib-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 
crc kubenswrapper[4694]: I0217 16:55:42.298527 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298549 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-kubelet\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298599 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovn-node-metrics-cert\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298680 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-systemd-units\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298703 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovnkube-config\") pod \"ovnkube-node-qqvfj\" (UID: 
\"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298738 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-log-socket" (OuterVolumeSpecName: "log-socket") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298813 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298881 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.298933 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299327 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299492 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-ovn\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299887 4694 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299913 4694 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299383 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299458 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299523 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-slash" (OuterVolumeSpecName: "host-slash") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299540 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299549 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299577 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300009 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300042 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.299931 4694 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300128 4694 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300151 4694 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300195 4694 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300213 4694 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300235 4694 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.300274 4694 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc 
kubenswrapper[4694]: I0217 16:55:42.305553 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15f1d18-d80a-4fc0-a710-a95c74465b6e-kube-api-access-hpm9k" (OuterVolumeSpecName: "kube-api-access-hpm9k") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "kube-api-access-hpm9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.306090 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.323661 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d15f1d18-d80a-4fc0-a710-a95c74465b6e" (UID: "d15f1d18-d80a-4fc0-a710-a95c74465b6e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.382093 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/2.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.383093 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/1.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.383201 4694 generic.go:334] "Generic (PLEG): container finished" podID="428dd081-b1bb-404f-856a-f33a1fa7c24a" containerID="4def840a9ce1c58602b78dea39755e808372c982a588c0057faf28647396f7e5" exitCode=2 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.383298 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerDied","Data":"4def840a9ce1c58602b78dea39755e808372c982a588c0057faf28647396f7e5"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.383346 4694 scope.go:117] "RemoveContainer" containerID="2f8b78df13f727684a09b53f0d87b1724931413245104f7b26480a61df075b09" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.384140 4694 scope.go:117] "RemoveContainer" containerID="4def840a9ce1c58602b78dea39755e808372c982a588c0057faf28647396f7e5" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.390416 4694 generic.go:334] "Generic (PLEG): container finished" podID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerID="85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.390601 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerDied","Data":"85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03"} Feb 17 16:55:42 crc 
kubenswrapper[4694]: I0217 16:55:42.403966 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-ovn\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404060 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-systemd\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404111 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-slash\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404153 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-node-log\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404211 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-run-netns\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404259 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-cni-netd\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404293 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-ovn\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404315 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkq5c\" (UniqueName: \"kubernetes.io/projected/4a088de9-a5d7-48d7-ac21-039f1dbc6358-kube-api-access-lkq5c\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404345 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovnkube-controller/3.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404370 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-run-netns\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404371 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-env-overrides\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" 
Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404401 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-slash\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404445 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-node-log\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404243 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-systemd\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404478 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-cni-netd\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404499 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-cni-bin\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.404550 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405419 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovnkube-script-lib\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405487 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405535 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-log-socket\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405589 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-etc-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405659 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-env-overrides\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405686 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-var-lib-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405817 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-var-lib-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405827 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-kubelet\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405901 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-kubelet\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405905 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.405962 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406085 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-run-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406154 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-log-socket\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406211 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-etc-openvswitch\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406266 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406304 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-host-cni-bin\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406430 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovn-node-metrics-cert\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406483 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-systemd-units\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406535 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovnkube-config\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406729 4694 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-slash\") on node \"crc\" 
DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406761 4694 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406792 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpm9k\" (UniqueName: \"kubernetes.io/projected/d15f1d18-d80a-4fc0-a710-a95c74465b6e-kube-api-access-hpm9k\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406816 4694 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406841 4694 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406866 4694 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d15f1d18-d80a-4fc0-a710-a95c74465b6e-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406891 4694 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406914 4694 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406936 4694 
reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406959 4694 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.406984 4694 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d15f1d18-d80a-4fc0-a710-a95c74465b6e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.407238 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovnkube-script-lib\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.407568 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a088de9-a5d7-48d7-ac21-039f1dbc6358-systemd-units\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.409850 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovn-acl-logging/0.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.412267 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8fjpm_d15f1d18-d80a-4fc0-a710-a95c74465b6e/ovn-controller/0.log" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 
16:55:42.412802 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.412922 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.412994 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413079 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413148 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413214 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" exitCode=0 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413282 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" exitCode=143 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413348 4694 generic.go:334] "Generic (PLEG): container finished" podID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" 
exitCode=143 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413426 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413554 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413647 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413765 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413861 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.413933 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414059 4694 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414142 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414208 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414287 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414346 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414405 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414468 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414525 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414582 4694 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414658 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414734 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414801 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414867 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414927 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414984 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415148 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} Feb 17 
16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415233 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415302 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415384 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415460 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415519 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovnkube-config\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415276 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415531 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415851 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.414600 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a088de9-a5d7-48d7-ac21-039f1dbc6358-ovn-node-metrics-cert\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415904 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415933 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415946 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415958 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415972 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.415988 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416002 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416015 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416029 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416042 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416064 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fjpm" event={"ID":"d15f1d18-d80a-4fc0-a710-a95c74465b6e","Type":"ContainerDied","Data":"a96ce4fc59b89f1167fcdd19815e797107f2a28859cd9561e5bb9d889bfcc8d3"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416091 4694 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416110 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416127 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416141 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416152 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416165 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416176 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416186 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416196 4694 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.416207 4694 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.444054 4694 scope.go:117] "RemoveContainer" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.444213 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkq5c\" (UniqueName: \"kubernetes.io/projected/4a088de9-a5d7-48d7-ac21-039f1dbc6358-kube-api-access-lkq5c\") pod \"ovnkube-node-qqvfj\" (UID: \"4a088de9-a5d7-48d7-ac21-039f1dbc6358\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.470867 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.473933 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8fjpm"] Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.478592 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8fjpm"] Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.503041 4694 scope.go:117] "RemoveContainer" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.512186 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.518803 4694 scope.go:117] "RemoveContainer" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.536751 4694 scope.go:117] "RemoveContainer" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.544365 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15f1d18_d80a_4fc0_a710_a95c74465b6e.slice/crio-a96ce4fc59b89f1167fcdd19815e797107f2a28859cd9561e5bb9d889bfcc8d3\": RecentStats: unable to find data in memory cache]" Feb 17 16:55:42 crc kubenswrapper[4694]: W0217 16:55:42.546155 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a088de9_a5d7_48d7_ac21_039f1dbc6358.slice/crio-cba6dd1428c2f79c3a5e8a8c800c58c421fcc336c3c7e8a2dc0ca85ddca5d252 WatchSource:0}: Error finding container cba6dd1428c2f79c3a5e8a8c800c58c421fcc336c3c7e8a2dc0ca85ddca5d252: Status 404 returned error can't find the container with id cba6dd1428c2f79c3a5e8a8c800c58c421fcc336c3c7e8a2dc0ca85ddca5d252 Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.554249 4694 scope.go:117] "RemoveContainer" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.570397 4694 scope.go:117] "RemoveContainer" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.588415 4694 scope.go:117] "RemoveContainer" containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.604808 4694 scope.go:117] 
"RemoveContainer" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.629934 4694 scope.go:117] "RemoveContainer" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.645147 4694 scope.go:117] "RemoveContainer" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.645758 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": container with ID starting with 75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0 not found: ID does not exist" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.645794 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} err="failed to get container status \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": rpc error: code = NotFound desc = could not find container \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": container with ID starting with 75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.645820 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.646460 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": container with ID starting with 
484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52 not found: ID does not exist" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.646501 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} err="failed to get container status \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": rpc error: code = NotFound desc = could not find container \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": container with ID starting with 484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.646532 4694 scope.go:117] "RemoveContainer" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.646990 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": container with ID starting with ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b not found: ID does not exist" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.647118 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} err="failed to get container status \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": rpc error: code = NotFound desc = could not find container \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": container with ID starting with ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b not found: ID does not 
exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.647229 4694 scope.go:117] "RemoveContainer" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.647718 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": container with ID starting with 46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4 not found: ID does not exist" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.647761 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} err="failed to get container status \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": rpc error: code = NotFound desc = could not find container \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": container with ID starting with 46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.647788 4694 scope.go:117] "RemoveContainer" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.648060 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": container with ID starting with 02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832 not found: ID does not exist" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648087 4694 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} err="failed to get container status \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": rpc error: code = NotFound desc = could not find container \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": container with ID starting with 02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648104 4694 scope.go:117] "RemoveContainer" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.648336 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": container with ID starting with e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536 not found: ID does not exist" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648365 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} err="failed to get container status \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": rpc error: code = NotFound desc = could not find container \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": container with ID starting with e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648381 4694 scope.go:117] "RemoveContainer" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.648626 4694 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": container with ID starting with 2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f not found: ID does not exist" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648660 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} err="failed to get container status \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": rpc error: code = NotFound desc = could not find container \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": container with ID starting with 2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648676 4694 scope.go:117] "RemoveContainer" containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.648877 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": container with ID starting with 12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980 not found: ID does not exist" containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648901 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} err="failed to get container status \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": rpc error: code = NotFound desc = could 
not find container \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": container with ID starting with 12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.648914 4694 scope.go:117] "RemoveContainer" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.649161 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": container with ID starting with 3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb not found: ID does not exist" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.649282 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} err="failed to get container status \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": rpc error: code = NotFound desc = could not find container \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": container with ID starting with 3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.649388 4694 scope.go:117] "RemoveContainer" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" Feb 17 16:55:42 crc kubenswrapper[4694]: E0217 16:55:42.649951 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": container with ID starting with 7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc not found: 
ID does not exist" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.649980 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} err="failed to get container status \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": rpc error: code = NotFound desc = could not find container \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": container with ID starting with 7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.649995 4694 scope.go:117] "RemoveContainer" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.650253 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} err="failed to get container status \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": rpc error: code = NotFound desc = could not find container \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": container with ID starting with 75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.650273 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.650557 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} err="failed to get container status \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": rpc error: code = 
NotFound desc = could not find container \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": container with ID starting with 484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.650581 4694 scope.go:117] "RemoveContainer" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.650888 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} err="failed to get container status \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": rpc error: code = NotFound desc = could not find container \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": container with ID starting with ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.651009 4694 scope.go:117] "RemoveContainer" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.651355 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} err="failed to get container status \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": rpc error: code = NotFound desc = could not find container \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": container with ID starting with 46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.651384 4694 scope.go:117] "RemoveContainer" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" Feb 17 16:55:42 crc 
kubenswrapper[4694]: I0217 16:55:42.651680 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} err="failed to get container status \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": rpc error: code = NotFound desc = could not find container \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": container with ID starting with 02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.651700 4694 scope.go:117] "RemoveContainer" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.651897 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} err="failed to get container status \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": rpc error: code = NotFound desc = could not find container \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": container with ID starting with e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.651923 4694 scope.go:117] "RemoveContainer" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.652483 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} err="failed to get container status \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": rpc error: code = NotFound desc = could not find container \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": container 
with ID starting with 2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.652593 4694 scope.go:117] "RemoveContainer" containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.653043 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} err="failed to get container status \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": rpc error: code = NotFound desc = could not find container \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": container with ID starting with 12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.653072 4694 scope.go:117] "RemoveContainer" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.653579 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} err="failed to get container status \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": rpc error: code = NotFound desc = could not find container \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": container with ID starting with 3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.653854 4694 scope.go:117] "RemoveContainer" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.654336 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} err="failed to get container status \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": rpc error: code = NotFound desc = could not find container \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": container with ID starting with 7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.654493 4694 scope.go:117] "RemoveContainer" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.654934 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} err="failed to get container status \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": rpc error: code = NotFound desc = could not find container \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": container with ID starting with 75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.654958 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.655385 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} err="failed to get container status \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": rpc error: code = NotFound desc = could not find container \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": container with ID starting with 484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52 not found: ID does not 
exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.655414 4694 scope.go:117] "RemoveContainer" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.655770 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} err="failed to get container status \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": rpc error: code = NotFound desc = could not find container \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": container with ID starting with ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.655878 4694 scope.go:117] "RemoveContainer" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.656239 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} err="failed to get container status \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": rpc error: code = NotFound desc = could not find container \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": container with ID starting with 46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.656267 4694 scope.go:117] "RemoveContainer" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.656563 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} err="failed to get container status 
\"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": rpc error: code = NotFound desc = could not find container \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": container with ID starting with 02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.656680 4694 scope.go:117] "RemoveContainer" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.657054 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} err="failed to get container status \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": rpc error: code = NotFound desc = could not find container \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": container with ID starting with e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.657090 4694 scope.go:117] "RemoveContainer" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.657469 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} err="failed to get container status \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": rpc error: code = NotFound desc = could not find container \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": container with ID starting with 2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.657570 4694 scope.go:117] "RemoveContainer" 
containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.657942 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} err="failed to get container status \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": rpc error: code = NotFound desc = could not find container \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": container with ID starting with 12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.658091 4694 scope.go:117] "RemoveContainer" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.658470 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} err="failed to get container status \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": rpc error: code = NotFound desc = could not find container \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": container with ID starting with 3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.658567 4694 scope.go:117] "RemoveContainer" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.658933 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} err="failed to get container status \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": rpc error: code = NotFound desc = could 
not find container \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": container with ID starting with 7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.659039 4694 scope.go:117] "RemoveContainer" containerID="75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.659460 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0"} err="failed to get container status \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": rpc error: code = NotFound desc = could not find container \"75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0\": container with ID starting with 75464f138f81a7c9f5e7ddc90d642a0e217797b22170db43b937cbe1f677ade0 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.659560 4694 scope.go:117] "RemoveContainer" containerID="484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.659858 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52"} err="failed to get container status \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": rpc error: code = NotFound desc = could not find container \"484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52\": container with ID starting with 484fc9d67acc129a11d46a41e76ca71c74af338cc4c4648858fab7c22111bc52 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.659972 4694 scope.go:117] "RemoveContainer" containerID="ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 
16:55:42.660356 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b"} err="failed to get container status \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": rpc error: code = NotFound desc = could not find container \"ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b\": container with ID starting with ffa20853fed2537c24666c6d61ceae1cfab2cff58a8c89ba8bdd7a5715b1283b not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.660490 4694 scope.go:117] "RemoveContainer" containerID="46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.660859 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4"} err="failed to get container status \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": rpc error: code = NotFound desc = could not find container \"46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4\": container with ID starting with 46112908043f1edd9956a6df8389476268f1f8a763aebb5d3523c9666e36c5e4 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.660986 4694 scope.go:117] "RemoveContainer" containerID="02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.661565 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832"} err="failed to get container status \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": rpc error: code = NotFound desc = could not find container \"02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832\": container with ID starting with 
02004f44ea4fa019a1dec8266657342b195b1ca6a836dfe42e2b2a9f2d28a832 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.661694 4694 scope.go:117] "RemoveContainer" containerID="e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.662133 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536"} err="failed to get container status \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": rpc error: code = NotFound desc = could not find container \"e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536\": container with ID starting with e8522d0f7301f0f8eb3dac7876798abbced91127686b2907631aa7406b46b536 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.662180 4694 scope.go:117] "RemoveContainer" containerID="2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.662682 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f"} err="failed to get container status \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": rpc error: code = NotFound desc = could not find container \"2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f\": container with ID starting with 2890ce756ac80f548107df0ade204ff47be1e5c01e849843817010601ba0613f not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.662791 4694 scope.go:117] "RemoveContainer" containerID="12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.663251 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980"} err="failed to get container status \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": rpc error: code = NotFound desc = could not find container \"12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980\": container with ID starting with 12093851af7070646afdfecd3f23ee8063e569878a48650f5e538bcac260b980 not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.663371 4694 scope.go:117] "RemoveContainer" containerID="3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.664790 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb"} err="failed to get container status \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": rpc error: code = NotFound desc = could not find container \"3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb\": container with ID starting with 3ae2fcb9264ec84486f0ce8f8fe2ac0a539b9c5af5cf07b2ad3c2640776ccacb not found: ID does not exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.664819 4694 scope.go:117] "RemoveContainer" containerID="7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.666365 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc"} err="failed to get container status \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": rpc error: code = NotFound desc = could not find container \"7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc\": container with ID starting with 7f4c082125c1e796dbcd76de6ddd9734818de4be691967e8bd1560f5ba4effdc not found: ID does not 
exist" Feb 17 16:55:42 crc kubenswrapper[4694]: I0217 16:55:42.908184 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15f1d18-d80a-4fc0-a710-a95c74465b6e" path="/var/lib/kubelet/pods/d15f1d18-d80a-4fc0-a710-a95c74465b6e/volumes" Feb 17 16:55:43 crc kubenswrapper[4694]: I0217 16:55:43.421280 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pj7v4_428dd081-b1bb-404f-856a-f33a1fa7c24a/kube-multus/2.log" Feb 17 16:55:43 crc kubenswrapper[4694]: I0217 16:55:43.421529 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pj7v4" event={"ID":"428dd081-b1bb-404f-856a-f33a1fa7c24a","Type":"ContainerStarted","Data":"e58c810b6f43d39b2a105f8935b43deb8be028f0f5562152bee83c2645fbe20c"} Feb 17 16:55:43 crc kubenswrapper[4694]: I0217 16:55:43.422858 4694 generic.go:334] "Generic (PLEG): container finished" podID="4a088de9-a5d7-48d7-ac21-039f1dbc6358" containerID="5ee97e07df53f15e24e386733fb7dc890bd328354a4864cf38dc2dd37d17f519" exitCode=0 Feb 17 16:55:43 crc kubenswrapper[4694]: I0217 16:55:43.422884 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerDied","Data":"5ee97e07df53f15e24e386733fb7dc890bd328354a4864cf38dc2dd37d17f519"} Feb 17 16:55:43 crc kubenswrapper[4694]: I0217 16:55:43.422899 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"cba6dd1428c2f79c3a5e8a8c800c58c421fcc336c3c7e8a2dc0ca85ddca5d252"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.429075 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerStarted","Data":"f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd"} Feb 17 
16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.433595 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"b8435d474fb70d91226bfb29296563fd038f0f2b70495b7893251477afaa1ba0"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.433652 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"d70becada300cf81576ae09ab05d968465f3ff1117d702cb1e4692d471b471c0"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.433667 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"635cdcec6ef11157df797963ffdea8fa61dcfdba64cb3284eb7ec0e5001b897b"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.433678 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"e7db819c243c66f72913b3a886527e4295f1f6392d55f92d3d87b7cf91b03104"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.433689 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"c29a1ed78ae9aff87fcb78c5ad92c7e17229e9f0ba470bcfd7ed61973938d572"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.433700 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"d3ea2a8d91bfc823aa935487f4b690083a97d2f1429e9daf9787a2a25ae82ee0"} Feb 17 16:55:44 crc kubenswrapper[4694]: I0217 16:55:44.446535 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vq5rk" podStartSLOduration=2.04982306 podStartE2EDuration="5.446520073s" podCreationTimestamp="2026-02-17 16:55:39 +0000 UTC" firstStartedPulling="2026-02-17 16:55:40.367566729 +0000 UTC m=+808.124642053" lastFinishedPulling="2026-02-17 16:55:43.764263742 +0000 UTC m=+811.521339066" observedRunningTime="2026-02-17 16:55:44.443715074 +0000 UTC m=+812.200790388" watchObservedRunningTime="2026-02-17 16:55:44.446520073 +0000 UTC m=+812.203595387" Feb 17 16:55:47 crc kubenswrapper[4694]: I0217 16:55:47.458120 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"c52da450ca1a6fadb6f1682902ac617ce6053c850b892e9537d64a1cadb31605"} Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.478597 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" event={"ID":"4a088de9-a5d7-48d7-ac21-039f1dbc6358","Type":"ContainerStarted","Data":"a6e16059816f4c508b8275a6307eac4c743043e73a8b4f25be57aebfdc22ff7a"} Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.479146 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.479194 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.479214 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.510108 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.517111 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" podStartSLOduration=7.517090391 podStartE2EDuration="7.517090391s" podCreationTimestamp="2026-02-17 16:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:55:49.512419757 +0000 UTC m=+817.269495101" watchObservedRunningTime="2026-02-17 16:55:49.517090391 +0000 UTC m=+817.274165725" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.522831 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.690743 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:49 crc kubenswrapper[4694]: I0217 16:55:49.690813 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:50 crc kubenswrapper[4694]: I0217 16:55:50.751071 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vq5rk" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="registry-server" probeResult="failure" output=< Feb 17 16:55:50 crc kubenswrapper[4694]: timeout: failed to connect service ":50051" within 1s Feb 17 16:55:50 crc kubenswrapper[4694]: > Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.414706 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nszmn"] Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.421844 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.422059 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nszmn"] Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.496514 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8kf2\" (UniqueName: \"kubernetes.io/projected/ecd32697-02c8-45c5-ac14-51097ed31af5-kube-api-access-h8kf2\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.496642 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-utilities\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.496692 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-catalog-content\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.597761 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-utilities\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.597837 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-catalog-content\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.597905 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8kf2\" (UniqueName: \"kubernetes.io/projected/ecd32697-02c8-45c5-ac14-51097ed31af5-kube-api-access-h8kf2\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.598470 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-catalog-content\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.598528 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-utilities\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.617334 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8kf2\" (UniqueName: \"kubernetes.io/projected/ecd32697-02c8-45c5-ac14-51097ed31af5-kube-api-access-h8kf2\") pod \"community-operators-nszmn\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.742504 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:55:56 crc kubenswrapper[4694]: I0217 16:55:56.993947 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nszmn"] Feb 17 16:55:57 crc kubenswrapper[4694]: I0217 16:55:57.527680 4694 generic.go:334] "Generic (PLEG): container finished" podID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerID="ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793" exitCode=0 Feb 17 16:55:57 crc kubenswrapper[4694]: I0217 16:55:57.527755 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszmn" event={"ID":"ecd32697-02c8-45c5-ac14-51097ed31af5","Type":"ContainerDied","Data":"ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793"} Feb 17 16:55:57 crc kubenswrapper[4694]: I0217 16:55:57.527801 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszmn" event={"ID":"ecd32697-02c8-45c5-ac14-51097ed31af5","Type":"ContainerStarted","Data":"30cabbf1015736dbfb5ccf260c3d4ba4c5e6fe094292e205bf57fcfa74d5f1c2"} Feb 17 16:55:58 crc kubenswrapper[4694]: I0217 16:55:58.537400 4694 generic.go:334] "Generic (PLEG): container finished" podID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerID="355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8" exitCode=0 Feb 17 16:55:58 crc kubenswrapper[4694]: I0217 16:55:58.537553 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszmn" event={"ID":"ecd32697-02c8-45c5-ac14-51097ed31af5","Type":"ContainerDied","Data":"355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8"} Feb 17 16:55:59 crc kubenswrapper[4694]: I0217 16:55:59.547326 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszmn" 
event={"ID":"ecd32697-02c8-45c5-ac14-51097ed31af5","Type":"ContainerStarted","Data":"ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63"} Feb 17 16:55:59 crc kubenswrapper[4694]: I0217 16:55:59.572192 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nszmn" podStartSLOduration=2.083855687 podStartE2EDuration="3.572169208s" podCreationTimestamp="2026-02-17 16:55:56 +0000 UTC" firstStartedPulling="2026-02-17 16:55:57.529822415 +0000 UTC m=+825.286897779" lastFinishedPulling="2026-02-17 16:55:59.018135936 +0000 UTC m=+826.775211300" observedRunningTime="2026-02-17 16:55:59.570122008 +0000 UTC m=+827.327197362" watchObservedRunningTime="2026-02-17 16:55:59.572169208 +0000 UTC m=+827.329244552" Feb 17 16:55:59 crc kubenswrapper[4694]: I0217 16:55:59.742886 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:55:59 crc kubenswrapper[4694]: I0217 16:55:59.803974 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:56:01 crc kubenswrapper[4694]: I0217 16:56:01.788196 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vq5rk"] Feb 17 16:56:01 crc kubenswrapper[4694]: I0217 16:56:01.789028 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vq5rk" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="registry-server" containerID="cri-o://f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd" gracePeriod=2 Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.209110 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.263740 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-catalog-content\") pod \"be187bfc-0eab-4648-95da-0d2e0100b6cb\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.263827 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wkfc\" (UniqueName: \"kubernetes.io/projected/be187bfc-0eab-4648-95da-0d2e0100b6cb-kube-api-access-6wkfc\") pod \"be187bfc-0eab-4648-95da-0d2e0100b6cb\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.263970 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-utilities\") pod \"be187bfc-0eab-4648-95da-0d2e0100b6cb\" (UID: \"be187bfc-0eab-4648-95da-0d2e0100b6cb\") " Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.265214 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-utilities" (OuterVolumeSpecName: "utilities") pod "be187bfc-0eab-4648-95da-0d2e0100b6cb" (UID: "be187bfc-0eab-4648-95da-0d2e0100b6cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.273010 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be187bfc-0eab-4648-95da-0d2e0100b6cb-kube-api-access-6wkfc" (OuterVolumeSpecName: "kube-api-access-6wkfc") pod "be187bfc-0eab-4648-95da-0d2e0100b6cb" (UID: "be187bfc-0eab-4648-95da-0d2e0100b6cb"). InnerVolumeSpecName "kube-api-access-6wkfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.365208 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wkfc\" (UniqueName: \"kubernetes.io/projected/be187bfc-0eab-4648-95da-0d2e0100b6cb-kube-api-access-6wkfc\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.365246 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.388333 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be187bfc-0eab-4648-95da-0d2e0100b6cb" (UID: "be187bfc-0eab-4648-95da-0d2e0100b6cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.466270 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be187bfc-0eab-4648-95da-0d2e0100b6cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.564875 4694 generic.go:334] "Generic (PLEG): container finished" podID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerID="f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd" exitCode=0 Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.564944 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vq5rk" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.564965 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerDied","Data":"f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd"} Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.565748 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq5rk" event={"ID":"be187bfc-0eab-4648-95da-0d2e0100b6cb","Type":"ContainerDied","Data":"009a1318d639947860c36be0a724915a6cd09dac0dff226490c478ab1c9b1a91"} Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.565783 4694 scope.go:117] "RemoveContainer" containerID="f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.580520 4694 scope.go:117] "RemoveContainer" containerID="85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.597563 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vq5rk"] Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.600895 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vq5rk"] Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.603826 4694 scope.go:117] "RemoveContainer" containerID="af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.619021 4694 scope.go:117] "RemoveContainer" containerID="f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd" Feb 17 16:56:02 crc kubenswrapper[4694]: E0217 16:56:02.619395 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd\": container with ID starting with f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd not found: ID does not exist" containerID="f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.619437 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd"} err="failed to get container status \"f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd\": rpc error: code = NotFound desc = could not find container \"f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd\": container with ID starting with f96593ba374b81f31aa1accbb0e61b01cbfd8b69275b082a519915d7a12b26dd not found: ID does not exist" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.619469 4694 scope.go:117] "RemoveContainer" containerID="85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03" Feb 17 16:56:02 crc kubenswrapper[4694]: E0217 16:56:02.619769 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03\": container with ID starting with 85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03 not found: ID does not exist" containerID="85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.619808 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03"} err="failed to get container status \"85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03\": rpc error: code = NotFound desc = could not find container \"85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03\": container with ID 
starting with 85584924bdd8b94a183fc3d9362358b67c2935bb4ab49faecdb1cc392b953a03 not found: ID does not exist" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.619833 4694 scope.go:117] "RemoveContainer" containerID="af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712" Feb 17 16:56:02 crc kubenswrapper[4694]: E0217 16:56:02.620065 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712\": container with ID starting with af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712 not found: ID does not exist" containerID="af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.620086 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712"} err="failed to get container status \"af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712\": rpc error: code = NotFound desc = could not find container \"af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712\": container with ID starting with af6227e665e81a072369c401dd61b92e6e35c9be477c9d1cc621bac419eff712 not found: ID does not exist" Feb 17 16:56:02 crc kubenswrapper[4694]: I0217 16:56:02.908735 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" path="/var/lib/kubelet/pods/be187bfc-0eab-4648-95da-0d2e0100b6cb/volumes" Feb 17 16:56:06 crc kubenswrapper[4694]: I0217 16:56:06.742841 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:56:06 crc kubenswrapper[4694]: I0217 16:56:06.743424 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:56:06 crc 
kubenswrapper[4694]: I0217 16:56:06.813725 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:56:07 crc kubenswrapper[4694]: I0217 16:56:07.654150 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:56:07 crc kubenswrapper[4694]: I0217 16:56:07.703557 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nszmn"] Feb 17 16:56:09 crc kubenswrapper[4694]: I0217 16:56:09.617563 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nszmn" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="registry-server" containerID="cri-o://ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63" gracePeriod=2 Feb 17 16:56:09 crc kubenswrapper[4694]: I0217 16:56:09.987754 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.061478 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8kf2\" (UniqueName: \"kubernetes.io/projected/ecd32697-02c8-45c5-ac14-51097ed31af5-kube-api-access-h8kf2\") pod \"ecd32697-02c8-45c5-ac14-51097ed31af5\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.061522 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-catalog-content\") pod \"ecd32697-02c8-45c5-ac14-51097ed31af5\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.061564 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-utilities\") pod \"ecd32697-02c8-45c5-ac14-51097ed31af5\" (UID: \"ecd32697-02c8-45c5-ac14-51097ed31af5\") " Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.062894 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-utilities" (OuterVolumeSpecName: "utilities") pod "ecd32697-02c8-45c5-ac14-51097ed31af5" (UID: "ecd32697-02c8-45c5-ac14-51097ed31af5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.068441 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd32697-02c8-45c5-ac14-51097ed31af5-kube-api-access-h8kf2" (OuterVolumeSpecName: "kube-api-access-h8kf2") pod "ecd32697-02c8-45c5-ac14-51097ed31af5" (UID: "ecd32697-02c8-45c5-ac14-51097ed31af5"). InnerVolumeSpecName "kube-api-access-h8kf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.116692 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecd32697-02c8-45c5-ac14-51097ed31af5" (UID: "ecd32697-02c8-45c5-ac14-51097ed31af5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.163378 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8kf2\" (UniqueName: \"kubernetes.io/projected/ecd32697-02c8-45c5-ac14-51097ed31af5-kube-api-access-h8kf2\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.163405 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.163414 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecd32697-02c8-45c5-ac14-51097ed31af5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.627991 4694 generic.go:334] "Generic (PLEG): container finished" podID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerID="ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63" exitCode=0 Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.628047 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nszmn" event={"ID":"ecd32697-02c8-45c5-ac14-51097ed31af5","Type":"ContainerDied","Data":"ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63"} Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.628078 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-nszmn" event={"ID":"ecd32697-02c8-45c5-ac14-51097ed31af5","Type":"ContainerDied","Data":"30cabbf1015736dbfb5ccf260c3d4ba4c5e6fe094292e205bf57fcfa74d5f1c2"} Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.628098 4694 scope.go:117] "RemoveContainer" containerID="ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.628123 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nszmn" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.655336 4694 scope.go:117] "RemoveContainer" containerID="355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.681446 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nszmn"] Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.684240 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nszmn"] Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.698879 4694 scope.go:117] "RemoveContainer" containerID="ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.718139 4694 scope.go:117] "RemoveContainer" containerID="ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63" Feb 17 16:56:10 crc kubenswrapper[4694]: E0217 16:56:10.718595 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63\": container with ID starting with ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63 not found: ID does not exist" containerID="ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 
16:56:10.718672 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63"} err="failed to get container status \"ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63\": rpc error: code = NotFound desc = could not find container \"ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63\": container with ID starting with ef5bc579d5c2ac33cfca61d58a2eae9e791a411a98305427c454a212a2e75c63 not found: ID does not exist" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.718708 4694 scope.go:117] "RemoveContainer" containerID="355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8" Feb 17 16:56:10 crc kubenswrapper[4694]: E0217 16:56:10.719004 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8\": container with ID starting with 355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8 not found: ID does not exist" containerID="355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.719034 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8"} err="failed to get container status \"355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8\": rpc error: code = NotFound desc = could not find container \"355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8\": container with ID starting with 355a9c7e2f6a382517e717a6c0a42a217aefe6040d9eb27bc69647cf1172e0b8 not found: ID does not exist" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.719054 4694 scope.go:117] "RemoveContainer" containerID="ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793" Feb 17 16:56:10 crc 
kubenswrapper[4694]: E0217 16:56:10.719386 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793\": container with ID starting with ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793 not found: ID does not exist" containerID="ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.719455 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793"} err="failed to get container status \"ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793\": rpc error: code = NotFound desc = could not find container \"ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793\": container with ID starting with ddd57d48e39ecd9ba8890ad82d73bc2fef416fd73e49ccca061bdc5c26375793 not found: ID does not exist" Feb 17 16:56:10 crc kubenswrapper[4694]: I0217 16:56:10.908308 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" path="/var/lib/kubelet/pods/ecd32697-02c8-45c5-ac14-51097ed31af5/volumes" Feb 17 16:56:12 crc kubenswrapper[4694]: I0217 16:56:12.548143 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqvfj" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:19.999597 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct"] Feb 17 16:56:20 crc kubenswrapper[4694]: E0217 16:56:20.000404 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="extract-content" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000417 4694 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="extract-content" Feb 17 16:56:20 crc kubenswrapper[4694]: E0217 16:56:20.000427 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="registry-server" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000433 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="registry-server" Feb 17 16:56:20 crc kubenswrapper[4694]: E0217 16:56:20.000442 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="extract-utilities" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000448 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="extract-utilities" Feb 17 16:56:20 crc kubenswrapper[4694]: E0217 16:56:20.000455 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="extract-utilities" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000461 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="extract-utilities" Feb 17 16:56:20 crc kubenswrapper[4694]: E0217 16:56:20.000470 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="extract-content" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000475 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="extract-content" Feb 17 16:56:20 crc kubenswrapper[4694]: E0217 16:56:20.000484 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="registry-server" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000490 4694 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="registry-server" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000579 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="be187bfc-0eab-4648-95da-0d2e0100b6cb" containerName="registry-server" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.000589 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd32697-02c8-45c5-ac14-51097ed31af5" containerName="registry-server" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.001229 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.004430 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.019881 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct"] Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.099249 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hxjj\" (UniqueName: \"kubernetes.io/projected/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-kube-api-access-7hxjj\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.099344 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.099461 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.201160 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hxjj\" (UniqueName: \"kubernetes.io/projected/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-kube-api-access-7hxjj\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.201247 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.201333 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: 
I0217 16:56:20.202101 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.202170 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.236781 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hxjj\" (UniqueName: \"kubernetes.io/projected/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-kube-api-access-7hxjj\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.321574 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.554180 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct"] Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.696265 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" event={"ID":"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f","Type":"ContainerStarted","Data":"b46a1c709d62ffc06e5d213604303072195522acaed737a16aba3909c9be2eba"} Feb 17 16:56:20 crc kubenswrapper[4694]: I0217 16:56:20.696329 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" event={"ID":"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f","Type":"ContainerStarted","Data":"b631ab40f1b61268ed3913fd5632ccf4c162e423bbef506790f90695e22f9531"} Feb 17 16:56:21 crc kubenswrapper[4694]: I0217 16:56:21.707911 4694 generic.go:334] "Generic (PLEG): container finished" podID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerID="b46a1c709d62ffc06e5d213604303072195522acaed737a16aba3909c9be2eba" exitCode=0 Feb 17 16:56:21 crc kubenswrapper[4694]: I0217 16:56:21.707974 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" event={"ID":"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f","Type":"ContainerDied","Data":"b46a1c709d62ffc06e5d213604303072195522acaed737a16aba3909c9be2eba"} Feb 17 16:56:23 crc kubenswrapper[4694]: I0217 16:56:23.720723 4694 generic.go:334] "Generic (PLEG): container finished" podID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerID="c0dfbc46a9f6269a28bc721b888528a27b5454d3287611b6071a0588dc4790bc" exitCode=0 Feb 17 16:56:23 crc kubenswrapper[4694]: I0217 16:56:23.720800 4694 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" event={"ID":"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f","Type":"ContainerDied","Data":"c0dfbc46a9f6269a28bc721b888528a27b5454d3287611b6071a0588dc4790bc"} Feb 17 16:56:24 crc kubenswrapper[4694]: I0217 16:56:24.731106 4694 generic.go:334] "Generic (PLEG): container finished" podID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerID="baa640b56a4c0ae103fdb3546959c2d595182e7419c9dec48d543e6f1b298925" exitCode=0 Feb 17 16:56:24 crc kubenswrapper[4694]: I0217 16:56:24.731174 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" event={"ID":"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f","Type":"ContainerDied","Data":"baa640b56a4c0ae103fdb3546959c2d595182e7419c9dec48d543e6f1b298925"} Feb 17 16:56:25 crc kubenswrapper[4694]: I0217 16:56:25.970736 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.072309 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-util\") pod \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.072367 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-bundle\") pod \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.072415 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hxjj\" (UniqueName: \"kubernetes.io/projected/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-kube-api-access-7hxjj\") pod \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\" (UID: \"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f\") " Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.073019 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-bundle" (OuterVolumeSpecName: "bundle") pod "664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" (UID: "664fb1f1-f1c6-483a-894d-8bf5ab3ec09f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.078851 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-kube-api-access-7hxjj" (OuterVolumeSpecName: "kube-api-access-7hxjj") pod "664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" (UID: "664fb1f1-f1c6-483a-894d-8bf5ab3ec09f"). InnerVolumeSpecName "kube-api-access-7hxjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.174760 4694 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.174829 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hxjj\" (UniqueName: \"kubernetes.io/projected/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-kube-api-access-7hxjj\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.260134 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-util" (OuterVolumeSpecName: "util") pod "664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" (UID: "664fb1f1-f1c6-483a-894d-8bf5ab3ec09f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.276384 4694 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/664fb1f1-f1c6-483a-894d-8bf5ab3ec09f-util\") on node \"crc\" DevicePath \"\"" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.749806 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" event={"ID":"664fb1f1-f1c6-483a-894d-8bf5ab3ec09f","Type":"ContainerDied","Data":"b631ab40f1b61268ed3913fd5632ccf4c162e423bbef506790f90695e22f9531"} Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.750204 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b631ab40f1b61268ed3913fd5632ccf4c162e423bbef506790f90695e22f9531" Feb 17 16:56:26 crc kubenswrapper[4694]: I0217 16:56:26.749880 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.586665 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-m65zv"] Feb 17 16:56:31 crc kubenswrapper[4694]: E0217 16:56:31.589429 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="util" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.589510 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="util" Feb 17 16:56:31 crc kubenswrapper[4694]: E0217 16:56:31.589564 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="extract" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.589702 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="extract" Feb 17 16:56:31 crc kubenswrapper[4694]: E0217 16:56:31.589768 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="pull" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.589829 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="pull" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.590037 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="664fb1f1-f1c6-483a-894d-8bf5ab3ec09f" containerName="extract" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.590595 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.593177 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.593449 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ltjnf" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.593752 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.605361 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-m65zv"] Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.643713 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9299z\" (UniqueName: \"kubernetes.io/projected/c5fe0950-b548-4cdc-9e9d-c2483a8213d9-kube-api-access-9299z\") pod \"nmstate-operator-694c9596b7-m65zv\" (UID: \"c5fe0950-b548-4cdc-9e9d-c2483a8213d9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.744464 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9299z\" (UniqueName: \"kubernetes.io/projected/c5fe0950-b548-4cdc-9e9d-c2483a8213d9-kube-api-access-9299z\") pod \"nmstate-operator-694c9596b7-m65zv\" (UID: \"c5fe0950-b548-4cdc-9e9d-c2483a8213d9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.762859 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9299z\" (UniqueName: \"kubernetes.io/projected/c5fe0950-b548-4cdc-9e9d-c2483a8213d9-kube-api-access-9299z\") pod \"nmstate-operator-694c9596b7-m65zv\" (UID: 
\"c5fe0950-b548-4cdc-9e9d-c2483a8213d9\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" Feb 17 16:56:31 crc kubenswrapper[4694]: I0217 16:56:31.913512 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" Feb 17 16:56:32 crc kubenswrapper[4694]: I0217 16:56:32.106903 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-m65zv"] Feb 17 16:56:32 crc kubenswrapper[4694]: I0217 16:56:32.791193 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" event={"ID":"c5fe0950-b548-4cdc-9e9d-c2483a8213d9","Type":"ContainerStarted","Data":"9a02fa10757ee66f9f59b6ecac23b478d139ac015016c303524da4a0dfbf5f66"} Feb 17 16:56:34 crc kubenswrapper[4694]: I0217 16:56:34.806083 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" event={"ID":"c5fe0950-b548-4cdc-9e9d-c2483a8213d9","Type":"ContainerStarted","Data":"795914ab2ecacdd62f0bda1e6fee80c0d36173252c4b26aee789ae6af15e98e9"} Feb 17 16:56:34 crc kubenswrapper[4694]: I0217 16:56:34.830565 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-m65zv" podStartSLOduration=1.689515871 podStartE2EDuration="3.830535666s" podCreationTimestamp="2026-02-17 16:56:31 +0000 UTC" firstStartedPulling="2026-02-17 16:56:32.118740441 +0000 UTC m=+859.875815755" lastFinishedPulling="2026-02-17 16:56:34.259760216 +0000 UTC m=+862.016835550" observedRunningTime="2026-02-17 16:56:34.829589032 +0000 UTC m=+862.586664406" watchObservedRunningTime="2026-02-17 16:56:34.830535666 +0000 UTC m=+862.587611030" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.420429 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 
16:56:40.422069 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.423987 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rn2vj" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.438858 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.446117 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.447081 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.454785 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxtds\" (UniqueName: \"kubernetes.io/projected/aa8c8617-1bdc-461a-9aea-d534da85b5e4-kube-api-access-lxtds\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.454948 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5sn\" (UniqueName: \"kubernetes.io/projected/e83663da-470f-4ebf-ac6b-64612e8724f4-kube-api-access-tb5sn\") pod \"nmstate-metrics-58c85c668d-jsvrg\" (UID: \"e83663da-470f-4ebf-ac6b-64612e8724f4\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.454986 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/aa8c8617-1bdc-461a-9aea-d534da85b5e4-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.456365 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-t7txz"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.456980 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.458928 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.479168 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.555941 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb5sn\" (UniqueName: \"kubernetes.io/projected/e83663da-470f-4ebf-ac6b-64612e8724f4-kube-api-access-tb5sn\") pod \"nmstate-metrics-58c85c668d-jsvrg\" (UID: \"e83663da-470f-4ebf-ac6b-64612e8724f4\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.555983 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa8c8617-1bdc-461a-9aea-d534da85b5e4-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.556031 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxtds\" (UniqueName: 
\"kubernetes.io/projected/aa8c8617-1bdc-461a-9aea-d534da85b5e4-kube-api-access-lxtds\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:40 crc kubenswrapper[4694]: E0217 16:56:40.556183 4694 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 17 16:56:40 crc kubenswrapper[4694]: E0217 16:56:40.556259 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa8c8617-1bdc-461a-9aea-d534da85b5e4-tls-key-pair podName:aa8c8617-1bdc-461a-9aea-d534da85b5e4 nodeName:}" failed. No retries permitted until 2026-02-17 16:56:41.056240905 +0000 UTC m=+868.813316229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/aa8c8617-1bdc-461a-9aea-d534da85b5e4-tls-key-pair") pod "nmstate-webhook-866bcb46dc-whp8f" (UID: "aa8c8617-1bdc-461a-9aea-d534da85b5e4") : secret "openshift-nmstate-webhook" not found Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.580325 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.581024 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.591992 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.592184 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5bbqj" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.592218 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.597417 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxtds\" (UniqueName: \"kubernetes.io/projected/aa8c8617-1bdc-461a-9aea-d534da85b5e4-kube-api-access-lxtds\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.598083 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb5sn\" (UniqueName: \"kubernetes.io/projected/e83663da-470f-4ebf-ac6b-64612e8724f4-kube-api-access-tb5sn\") pod \"nmstate-metrics-58c85c668d-jsvrg\" (UID: \"e83663da-470f-4ebf-ac6b-64612e8724f4\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.601790 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.657011 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkjd5\" (UniqueName: \"kubernetes.io/projected/13812f03-334f-44fa-9c5c-c5d257756b27-kube-api-access-xkjd5\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " 
pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.657092 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-ovs-socket\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.657620 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-nmstate-lock\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.657647 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-dbus-socket\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.754504 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.760224 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c8bfd7975-5j7f9"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.760921 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-ovs-socket\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761054 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-nmstate-lock\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761143 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-dbus-socket\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761518 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f3c6b99-3db1-447f-b31c-1692a70ec415-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761054 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-ovs-socket\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761690 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3c6b99-3db1-447f-b31c-1692a70ec415-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761465 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-dbus-socket\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761085 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/13812f03-334f-44fa-9c5c-c5d257756b27-nmstate-lock\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761381 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.761907 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnk6h\" (UniqueName: \"kubernetes.io/projected/9f3c6b99-3db1-447f-b31c-1692a70ec415-kube-api-access-rnk6h\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.762044 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkjd5\" (UniqueName: \"kubernetes.io/projected/13812f03-334f-44fa-9c5c-c5d257756b27-kube-api-access-xkjd5\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.782780 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c8bfd7975-5j7f9"] Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.787025 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkjd5\" (UniqueName: \"kubernetes.io/projected/13812f03-334f-44fa-9c5c-c5d257756b27-kube-api-access-xkjd5\") pod \"nmstate-handler-t7txz\" (UID: \"13812f03-334f-44fa-9c5c-c5d257756b27\") " pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.793850 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.848812 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-t7txz" event={"ID":"13812f03-334f-44fa-9c5c-c5d257756b27","Type":"ContainerStarted","Data":"846ca066eb40652ad5989c3e9987f1c3c4f421fe2c4087305f6bf5b8cf916692"} Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862713 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f3c6b99-3db1-447f-b31c-1692a70ec415-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862748 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3c6b99-3db1-447f-b31c-1692a70ec415-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862768 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-serving-cert\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862784 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-trusted-ca-bundle\") pod \"console-7c8bfd7975-5j7f9\" (UID: 
\"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862803 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25j9\" (UniqueName: \"kubernetes.io/projected/1dcf9418-b5e5-4513-9882-e02e009b9bc6-kube-api-access-s25j9\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862820 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-oauth-serving-cert\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862836 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-config\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862873 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnk6h\" (UniqueName: \"kubernetes.io/projected/9f3c6b99-3db1-447f-b31c-1692a70ec415-kube-api-access-rnk6h\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862900 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-service-ca\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.862940 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-oauth-config\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.863800 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9f3c6b99-3db1-447f-b31c-1692a70ec415-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: E0217 16:56:40.863883 4694 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 17 16:56:40 crc kubenswrapper[4694]: E0217 16:56:40.863929 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f3c6b99-3db1-447f-b31c-1692a70ec415-plugin-serving-cert podName:9f3c6b99-3db1-447f-b31c-1692a70ec415 nodeName:}" failed. No retries permitted until 2026-02-17 16:56:41.363913541 +0000 UTC m=+869.120988865 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/9f3c6b99-3db1-447f-b31c-1692a70ec415-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-6xx9x" (UID: "9f3c6b99-3db1-447f-b31c-1692a70ec415") : secret "plugin-serving-cert" not found Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.913822 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnk6h\" (UniqueName: \"kubernetes.io/projected/9f3c6b99-3db1-447f-b31c-1692a70ec415-kube-api-access-rnk6h\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964314 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-oauth-config\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964540 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-serving-cert\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964567 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-trusted-ca-bundle\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964586 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25j9\" (UniqueName: \"kubernetes.io/projected/1dcf9418-b5e5-4513-9882-e02e009b9bc6-kube-api-access-s25j9\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964629 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-oauth-serving-cert\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964646 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-config\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.964669 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-service-ca\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.965574 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-service-ca\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.966381 4694 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-oauth-serving-cert\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.966766 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-trusted-ca-bundle\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.967957 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-config\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.970187 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-serving-cert\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.970212 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1dcf9418-b5e5-4513-9882-e02e009b9bc6-console-oauth-config\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:40 crc kubenswrapper[4694]: I0217 16:56:40.982394 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25j9\" (UniqueName: 
\"kubernetes.io/projected/1dcf9418-b5e5-4513-9882-e02e009b9bc6-kube-api-access-s25j9\") pod \"console-7c8bfd7975-5j7f9\" (UID: \"1dcf9418-b5e5-4513-9882-e02e009b9bc6\") " pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.065852 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa8c8617-1bdc-461a-9aea-d534da85b5e4-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.070345 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa8c8617-1bdc-461a-9aea-d534da85b5e4-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-whp8f\" (UID: \"aa8c8617-1bdc-461a-9aea-d534da85b5e4\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.077966 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.122171 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.214565 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg"] Feb 17 16:56:41 crc kubenswrapper[4694]: W0217 16:56:41.220567 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode83663da_470f_4ebf_ac6b_64612e8724f4.slice/crio-b38f3afb25c7517fd415c51663eaa1047e78fa8ffd1f21a488acc3f2d94513f3 WatchSource:0}: Error finding container b38f3afb25c7517fd415c51663eaa1047e78fa8ffd1f21a488acc3f2d94513f3: Status 404 returned error can't find the container with id b38f3afb25c7517fd415c51663eaa1047e78fa8ffd1f21a488acc3f2d94513f3 Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.311004 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f"] Feb 17 16:56:41 crc kubenswrapper[4694]: W0217 16:56:41.318018 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa8c8617_1bdc_461a_9aea_d534da85b5e4.slice/crio-c5e23ef73d8f333b4998e3d5c16a03afa9314333c0224593dd64cb36c9fc250b WatchSource:0}: Error finding container c5e23ef73d8f333b4998e3d5c16a03afa9314333c0224593dd64cb36c9fc250b: Status 404 returned error can't find the container with id c5e23ef73d8f333b4998e3d5c16a03afa9314333c0224593dd64cb36c9fc250b Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.349460 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c8bfd7975-5j7f9"] Feb 17 16:56:41 crc kubenswrapper[4694]: W0217 16:56:41.354236 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dcf9418_b5e5_4513_9882_e02e009b9bc6.slice/crio-f686dfead319a466797022c6b43798595461dc878e668bfc9c793754f522c6da WatchSource:0}: 
Error finding container f686dfead319a466797022c6b43798595461dc878e668bfc9c793754f522c6da: Status 404 returned error can't find the container with id f686dfead319a466797022c6b43798595461dc878e668bfc9c793754f522c6da Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.369204 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3c6b99-3db1-447f-b31c-1692a70ec415-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.373788 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3c6b99-3db1-447f-b31c-1692a70ec415-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-6xx9x\" (UID: \"9f3c6b99-3db1-447f-b31c-1692a70ec415\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.530825 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.740087 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x"] Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.856298 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" event={"ID":"e83663da-470f-4ebf-ac6b-64612e8724f4","Type":"ContainerStarted","Data":"b38f3afb25c7517fd415c51663eaa1047e78fa8ffd1f21a488acc3f2d94513f3"} Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.857467 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" event={"ID":"9f3c6b99-3db1-447f-b31c-1692a70ec415","Type":"ContainerStarted","Data":"fadf54cd19524920ba8270f391167b08095f345dbb84cd5d8f20b8f439593050"} Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.858512 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" event={"ID":"aa8c8617-1bdc-461a-9aea-d534da85b5e4","Type":"ContainerStarted","Data":"c5e23ef73d8f333b4998e3d5c16a03afa9314333c0224593dd64cb36c9fc250b"} Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.860066 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c8bfd7975-5j7f9" event={"ID":"1dcf9418-b5e5-4513-9882-e02e009b9bc6","Type":"ContainerStarted","Data":"5052a4056b067d954a0fa6908c6948c53101f7034e6d3686571e4bec084bb4ee"} Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.860102 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c8bfd7975-5j7f9" event={"ID":"1dcf9418-b5e5-4513-9882-e02e009b9bc6","Type":"ContainerStarted","Data":"f686dfead319a466797022c6b43798595461dc878e668bfc9c793754f522c6da"} Feb 17 16:56:41 crc kubenswrapper[4694]: I0217 16:56:41.882746 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c8bfd7975-5j7f9" podStartSLOduration=1.8827291430000002 podStartE2EDuration="1.882729143s" podCreationTimestamp="2026-02-17 16:56:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:56:41.878304565 +0000 UTC m=+869.635379889" watchObservedRunningTime="2026-02-17 16:56:41.882729143 +0000 UTC m=+869.639804467" Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.873738 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" event={"ID":"aa8c8617-1bdc-461a-9aea-d534da85b5e4","Type":"ContainerStarted","Data":"301ba0de0b7bb33a08b70205705f6d70cc06f2ed2301f36ecabd9cd174f1c84a"} Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.874481 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.875905 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-t7txz" event={"ID":"13812f03-334f-44fa-9c5c-c5d257756b27","Type":"ContainerStarted","Data":"51f703a142f5dc2451c43e7d6045e9178a9b2e08be6e9c7b236823126ad93684"} Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.876050 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.877567 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" event={"ID":"e83663da-470f-4ebf-ac6b-64612e8724f4","Type":"ContainerStarted","Data":"92d782a259d7b32ffead14eca25f490731e6c90519880ea2704fa39732d8f46d"} Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.891432 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" podStartSLOduration=1.642186892 podStartE2EDuration="3.891414238s" podCreationTimestamp="2026-02-17 16:56:40 +0000 UTC" firstStartedPulling="2026-02-17 16:56:41.320631986 +0000 UTC m=+869.077707310" lastFinishedPulling="2026-02-17 16:56:43.569859332 +0000 UTC m=+871.326934656" observedRunningTime="2026-02-17 16:56:43.890002753 +0000 UTC m=+871.647078077" watchObservedRunningTime="2026-02-17 16:56:43.891414238 +0000 UTC m=+871.648489562" Feb 17 16:56:43 crc kubenswrapper[4694]: I0217 16:56:43.908471 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-t7txz" podStartSLOduration=1.219975631 podStartE2EDuration="3.908454235s" podCreationTimestamp="2026-02-17 16:56:40 +0000 UTC" firstStartedPulling="2026-02-17 16:56:40.839415821 +0000 UTC m=+868.596491145" lastFinishedPulling="2026-02-17 16:56:43.527894425 +0000 UTC m=+871.284969749" observedRunningTime="2026-02-17 16:56:43.905265657 +0000 UTC m=+871.662340981" watchObservedRunningTime="2026-02-17 16:56:43.908454235 +0000 UTC m=+871.665529569" Feb 17 16:56:44 crc kubenswrapper[4694]: I0217 16:56:44.887118 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" event={"ID":"9f3c6b99-3db1-447f-b31c-1692a70ec415","Type":"ContainerStarted","Data":"414f968684664e51a0a41d6cef528fc249ae9661a9b7b1f290d35cbf5b366ad8"} Feb 17 16:56:44 crc kubenswrapper[4694]: I0217 16:56:44.916462 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-6xx9x" podStartSLOduration=2.188383029 podStartE2EDuration="4.916435062s" podCreationTimestamp="2026-02-17 16:56:40 +0000 UTC" firstStartedPulling="2026-02-17 16:56:41.747259455 +0000 UTC m=+869.504334779" lastFinishedPulling="2026-02-17 16:56:44.475311488 +0000 UTC m=+872.232386812" observedRunningTime="2026-02-17 16:56:44.906728964 +0000 UTC 
m=+872.663804288" watchObservedRunningTime="2026-02-17 16:56:44.916435062 +0000 UTC m=+872.673510406" Feb 17 16:56:46 crc kubenswrapper[4694]: I0217 16:56:46.902654 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" event={"ID":"e83663da-470f-4ebf-ac6b-64612e8724f4","Type":"ContainerStarted","Data":"dc0e6efc54aa12f2e62f6e1c6e0d6928d261050f68ae78f0773036581e2f8bc1"} Feb 17 16:56:46 crc kubenswrapper[4694]: I0217 16:56:46.932351 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-jsvrg" podStartSLOduration=2.183330234 podStartE2EDuration="6.932322853s" podCreationTimestamp="2026-02-17 16:56:40 +0000 UTC" firstStartedPulling="2026-02-17 16:56:41.222878252 +0000 UTC m=+868.979953566" lastFinishedPulling="2026-02-17 16:56:45.971870861 +0000 UTC m=+873.728946185" observedRunningTime="2026-02-17 16:56:46.918527175 +0000 UTC m=+874.675602499" watchObservedRunningTime="2026-02-17 16:56:46.932322853 +0000 UTC m=+874.689398217" Feb 17 16:56:50 crc kubenswrapper[4694]: I0217 16:56:50.816154 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-t7txz" Feb 17 16:56:51 crc kubenswrapper[4694]: I0217 16:56:51.123072 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:51 crc kubenswrapper[4694]: I0217 16:56:51.123436 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:51 crc kubenswrapper[4694]: I0217 16:56:51.129173 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 16:56:51 crc kubenswrapper[4694]: I0217 16:56:51.941892 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c8bfd7975-5j7f9" Feb 17 
16:56:52 crc kubenswrapper[4694]: I0217 16:56:52.014283 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-896vh"] Feb 17 16:57:01 crc kubenswrapper[4694]: I0217 16:57:01.089185 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-whp8f" Feb 17 16:57:14 crc kubenswrapper[4694]: I0217 16:57:14.978103 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv"] Feb 17 16:57:14 crc kubenswrapper[4694]: I0217 16:57:14.980529 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:14 crc kubenswrapper[4694]: I0217 16:57:14.983542 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 16:57:14 crc kubenswrapper[4694]: I0217 16:57:14.985969 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv"] Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.076989 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.077035 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: 
\"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.077224 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbr2s\" (UniqueName: \"kubernetes.io/projected/9d0c9969-a494-496f-bc4d-721e0a4ac013-kube-api-access-dbr2s\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.177907 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbr2s\" (UniqueName: \"kubernetes.io/projected/9d0c9969-a494-496f-bc4d-721e0a4ac013-kube-api-access-dbr2s\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.177963 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.177982 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.178444 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.178899 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.201688 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbr2s\" (UniqueName: \"kubernetes.io/projected/9d0c9969-a494-496f-bc4d-721e0a4ac013-kube-api-access-dbr2s\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.341901 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:15 crc kubenswrapper[4694]: I0217 16:57:15.557791 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv"] Feb 17 16:57:16 crc kubenswrapper[4694]: I0217 16:57:16.087740 4694 generic.go:334] "Generic (PLEG): container finished" podID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerID="505728c8259b78881f95163fcc7f8ff69163b2d01b58f657ae418895a619ad82" exitCode=0 Feb 17 16:57:16 crc kubenswrapper[4694]: I0217 16:57:16.087827 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" event={"ID":"9d0c9969-a494-496f-bc4d-721e0a4ac013","Type":"ContainerDied","Data":"505728c8259b78881f95163fcc7f8ff69163b2d01b58f657ae418895a619ad82"} Feb 17 16:57:16 crc kubenswrapper[4694]: I0217 16:57:16.088133 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" event={"ID":"9d0c9969-a494-496f-bc4d-721e0a4ac013","Type":"ContainerStarted","Data":"2439806dc9b0f96fc098a6aeb7c7234c270ae347aba8e67a056f1d85f4a745ab"} Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.057685 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-896vh" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerName="console" containerID="cri-o://07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706" gracePeriod=15 Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.451227 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-896vh_4387c481-04e8-4060-affe-f9b6fc0b1406/console/0.log" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.451682 4694 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.517293 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-oauth-config\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.517378 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-trusted-ca-bundle\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.517449 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s2vh\" (UniqueName: \"kubernetes.io/projected/4387c481-04e8-4060-affe-f9b6fc0b1406-kube-api-access-2s2vh\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.517490 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-oauth-serving-cert\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.517572 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-service-ca\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.518296 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-console-config\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.518354 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-serving-cert\") pod \"4387c481-04e8-4060-affe-f9b6fc0b1406\" (UID: \"4387c481-04e8-4060-affe-f9b6fc0b1406\") " Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.521537 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.521693 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.521718 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-service-ca" (OuterVolumeSpecName: "service-ca") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.522021 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-console-config" (OuterVolumeSpecName: "console-config") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.526359 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.526681 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.526721 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4387c481-04e8-4060-affe-f9b6fc0b1406-kube-api-access-2s2vh" (OuterVolumeSpecName: "kube-api-access-2s2vh") pod "4387c481-04e8-4060-affe-f9b6fc0b1406" (UID: "4387c481-04e8-4060-affe-f9b6fc0b1406"). InnerVolumeSpecName "kube-api-access-2s2vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.619880 4694 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.620168 4694 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4387c481-04e8-4060-affe-f9b6fc0b1406-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.620265 4694 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.620283 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s2vh\" (UniqueName: \"kubernetes.io/projected/4387c481-04e8-4060-affe-f9b6fc0b1406-kube-api-access-2s2vh\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.620295 4694 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.620307 4694 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:17 crc kubenswrapper[4694]: I0217 16:57:17.620318 4694 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4387c481-04e8-4060-affe-f9b6fc0b1406-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:18 crc 
kubenswrapper[4694]: I0217 16:57:18.105001 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-896vh_4387c481-04e8-4060-affe-f9b6fc0b1406/console/0.log" Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.105080 4694 generic.go:334] "Generic (PLEG): container finished" podID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerID="07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706" exitCode=2 Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.105122 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-896vh" event={"ID":"4387c481-04e8-4060-affe-f9b6fc0b1406","Type":"ContainerDied","Data":"07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706"} Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.105160 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-896vh" event={"ID":"4387c481-04e8-4060-affe-f9b6fc0b1406","Type":"ContainerDied","Data":"ae9a2ef84a342ccdec952795ad9e6595dcb9a9b950cad3ef3dc9dd6410ab3f9f"} Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.105189 4694 scope.go:117] "RemoveContainer" containerID="07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706" Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.105400 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-896vh" Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.144450 4694 scope.go:117] "RemoveContainer" containerID="07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706" Feb 17 16:57:18 crc kubenswrapper[4694]: E0217 16:57:18.146135 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706\": container with ID starting with 07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706 not found: ID does not exist" containerID="07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706" Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.146202 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706"} err="failed to get container status \"07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706\": rpc error: code = NotFound desc = could not find container \"07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706\": container with ID starting with 07e49b5c64808910e5ededc392af096aacc1ec8210f657f2fc8b005730a5b706 not found: ID does not exist" Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.168263 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-896vh"] Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.175111 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-896vh"] Feb 17 16:57:18 crc kubenswrapper[4694]: I0217 16:57:18.901356 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" path="/var/lib/kubelet/pods/4387c481-04e8-4060-affe-f9b6fc0b1406/volumes" Feb 17 16:57:19 crc kubenswrapper[4694]: I0217 16:57:19.115363 4694 generic.go:334] "Generic (PLEG): 
container finished" podID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerID="6a9075e0da00dd9f47ef3ba4f8d6f7799af8870453a21ff04d2cd98de7831a17" exitCode=0 Feb 17 16:57:19 crc kubenswrapper[4694]: I0217 16:57:19.115416 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" event={"ID":"9d0c9969-a494-496f-bc4d-721e0a4ac013","Type":"ContainerDied","Data":"6a9075e0da00dd9f47ef3ba4f8d6f7799af8870453a21ff04d2cd98de7831a17"} Feb 17 16:57:20 crc kubenswrapper[4694]: I0217 16:57:20.127483 4694 generic.go:334] "Generic (PLEG): container finished" podID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerID="e932b0d51a3e694625d39ccb87975549f8f327544d9faac44f8feb2523956f8d" exitCode=0 Feb 17 16:57:20 crc kubenswrapper[4694]: I0217 16:57:20.127529 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" event={"ID":"9d0c9969-a494-496f-bc4d-721e0a4ac013","Type":"ContainerDied","Data":"e932b0d51a3e694625d39ccb87975549f8f327544d9faac44f8feb2523956f8d"} Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.382879 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.477056 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-util\") pod \"9d0c9969-a494-496f-bc4d-721e0a4ac013\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.477130 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-bundle\") pod \"9d0c9969-a494-496f-bc4d-721e0a4ac013\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.477308 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbr2s\" (UniqueName: \"kubernetes.io/projected/9d0c9969-a494-496f-bc4d-721e0a4ac013-kube-api-access-dbr2s\") pod \"9d0c9969-a494-496f-bc4d-721e0a4ac013\" (UID: \"9d0c9969-a494-496f-bc4d-721e0a4ac013\") " Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.479284 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-bundle" (OuterVolumeSpecName: "bundle") pod "9d0c9969-a494-496f-bc4d-721e0a4ac013" (UID: "9d0c9969-a494-496f-bc4d-721e0a4ac013"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.486985 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-util" (OuterVolumeSpecName: "util") pod "9d0c9969-a494-496f-bc4d-721e0a4ac013" (UID: "9d0c9969-a494-496f-bc4d-721e0a4ac013"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.487583 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0c9969-a494-496f-bc4d-721e0a4ac013-kube-api-access-dbr2s" (OuterVolumeSpecName: "kube-api-access-dbr2s") pod "9d0c9969-a494-496f-bc4d-721e0a4ac013" (UID: "9d0c9969-a494-496f-bc4d-721e0a4ac013"). InnerVolumeSpecName "kube-api-access-dbr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.578913 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbr2s\" (UniqueName: \"kubernetes.io/projected/9d0c9969-a494-496f-bc4d-721e0a4ac013-kube-api-access-dbr2s\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.578952 4694 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-util\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:21 crc kubenswrapper[4694]: I0217 16:57:21.578964 4694 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d0c9969-a494-496f-bc4d-721e0a4ac013-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:22 crc kubenswrapper[4694]: I0217 16:57:22.146228 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" Feb 17 16:57:22 crc kubenswrapper[4694]: I0217 16:57:22.147109 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv" event={"ID":"9d0c9969-a494-496f-bc4d-721e0a4ac013","Type":"ContainerDied","Data":"2439806dc9b0f96fc098a6aeb7c7234c270ae347aba8e67a056f1d85f4a745ab"} Feb 17 16:57:22 crc kubenswrapper[4694]: I0217 16:57:22.147183 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2439806dc9b0f96fc098a6aeb7c7234c270ae347aba8e67a056f1d85f4a745ab" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.312901 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bs2jq"] Feb 17 16:57:28 crc kubenswrapper[4694]: E0217 16:57:28.313764 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerName="extract" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.313806 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerName="extract" Feb 17 16:57:28 crc kubenswrapper[4694]: E0217 16:57:28.313840 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerName="console" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.313850 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerName="console" Feb 17 16:57:28 crc kubenswrapper[4694]: E0217 16:57:28.313867 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerName="pull" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.313877 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" 
containerName="pull" Feb 17 16:57:28 crc kubenswrapper[4694]: E0217 16:57:28.313892 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerName="util" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.313900 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerName="util" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.314065 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4387c481-04e8-4060-affe-f9b6fc0b1406" containerName="console" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.314102 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0c9969-a494-496f-bc4d-721e0a4ac013" containerName="extract" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.315097 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.334951 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bs2jq"] Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.366892 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-catalog-content\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.366935 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvczr\" (UniqueName: \"kubernetes.io/projected/bb1cca17-c4a5-497a-8362-fe1568581692-kube-api-access-fvczr\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 
17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.366981 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-utilities\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.468412 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-utilities\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.468518 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-catalog-content\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.468554 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvczr\" (UniqueName: \"kubernetes.io/projected/bb1cca17-c4a5-497a-8362-fe1568581692-kube-api-access-fvczr\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.469052 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-catalog-content\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc 
kubenswrapper[4694]: I0217 16:57:28.469058 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-utilities\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.489447 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvczr\" (UniqueName: \"kubernetes.io/projected/bb1cca17-c4a5-497a-8362-fe1568581692-kube-api-access-fvczr\") pod \"certified-operators-bs2jq\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.636267 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:28 crc kubenswrapper[4694]: I0217 16:57:28.953435 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bs2jq"] Feb 17 16:57:29 crc kubenswrapper[4694]: I0217 16:57:29.184977 4694 generic.go:334] "Generic (PLEG): container finished" podID="bb1cca17-c4a5-497a-8362-fe1568581692" containerID="6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32" exitCode=0 Feb 17 16:57:29 crc kubenswrapper[4694]: I0217 16:57:29.185040 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bs2jq" event={"ID":"bb1cca17-c4a5-497a-8362-fe1568581692","Type":"ContainerDied","Data":"6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32"} Feb 17 16:57:29 crc kubenswrapper[4694]: I0217 16:57:29.185335 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bs2jq" 
event={"ID":"bb1cca17-c4a5-497a-8362-fe1568581692","Type":"ContainerStarted","Data":"b1ef6ecc1a1a03bfac8853f687893d51dafb7ab6fc209b67a0039ed41d88ee0c"} Feb 17 16:57:30 crc kubenswrapper[4694]: I0217 16:57:30.194623 4694 generic.go:334] "Generic (PLEG): container finished" podID="bb1cca17-c4a5-497a-8362-fe1568581692" containerID="087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8" exitCode=0 Feb 17 16:57:30 crc kubenswrapper[4694]: I0217 16:57:30.194684 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bs2jq" event={"ID":"bb1cca17-c4a5-497a-8362-fe1568581692","Type":"ContainerDied","Data":"087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8"} Feb 17 16:57:31 crc kubenswrapper[4694]: I0217 16:57:31.202217 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bs2jq" event={"ID":"bb1cca17-c4a5-497a-8362-fe1568581692","Type":"ContainerStarted","Data":"aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf"} Feb 17 16:57:31 crc kubenswrapper[4694]: I0217 16:57:31.219688 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bs2jq" podStartSLOduration=1.794487843 podStartE2EDuration="3.219669656s" podCreationTimestamp="2026-02-17 16:57:28 +0000 UTC" firstStartedPulling="2026-02-17 16:57:29.186483233 +0000 UTC m=+916.943558557" lastFinishedPulling="2026-02-17 16:57:30.611665046 +0000 UTC m=+918.368740370" observedRunningTime="2026-02-17 16:57:31.218863326 +0000 UTC m=+918.975938670" watchObservedRunningTime="2026-02-17 16:57:31.219669656 +0000 UTC m=+918.976744980" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.429908 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs"] Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.430890 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.432705 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-b29cp" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.433684 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.433948 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.434341 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.447683 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.448902 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs"] Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.552218 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/152f177f-c542-4114-9d7b-601185b129b2-apiservice-cert\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.552513 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qckvg\" (UniqueName: \"kubernetes.io/projected/152f177f-c542-4114-9d7b-601185b129b2-kube-api-access-qckvg\") pod 
\"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.552640 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/152f177f-c542-4114-9d7b-601185b129b2-webhook-cert\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.648867 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7756f55684-8twln"] Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.649885 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.652140 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-f9zq2" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.653424 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qckvg\" (UniqueName: \"kubernetes.io/projected/152f177f-c542-4114-9d7b-601185b129b2-kube-api-access-qckvg\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.653472 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/152f177f-c542-4114-9d7b-601185b129b2-webhook-cert\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: 
\"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.653524 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/152f177f-c542-4114-9d7b-601185b129b2-apiservice-cert\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.655752 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.655803 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.660489 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/152f177f-c542-4114-9d7b-601185b129b2-apiservice-cert\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.660685 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/152f177f-c542-4114-9d7b-601185b129b2-webhook-cert\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.670902 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7756f55684-8twln"] Feb 17 16:57:34 crc 
kubenswrapper[4694]: I0217 16:57:34.679430 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qckvg\" (UniqueName: \"kubernetes.io/projected/152f177f-c542-4114-9d7b-601185b129b2-kube-api-access-qckvg\") pod \"metallb-operator-controller-manager-7d47c4d78b-gffhs\" (UID: \"152f177f-c542-4114-9d7b-601185b129b2\") " pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.747294 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.754750 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcmg\" (UniqueName: \"kubernetes.io/projected/d2d8d1c1-ad28-45c6-8314-935e8c60b976-kube-api-access-4vcmg\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.754817 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2d8d1c1-ad28-45c6-8314-935e8c60b976-webhook-cert\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.754899 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2d8d1c1-ad28-45c6-8314-935e8c60b976-apiservice-cert\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " 
pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.856474 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2d8d1c1-ad28-45c6-8314-935e8c60b976-apiservice-cert\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.856570 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcmg\" (UniqueName: \"kubernetes.io/projected/d2d8d1c1-ad28-45c6-8314-935e8c60b976-kube-api-access-4vcmg\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.856589 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2d8d1c1-ad28-45c6-8314-935e8c60b976-webhook-cert\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.861420 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2d8d1c1-ad28-45c6-8314-935e8c60b976-webhook-cert\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.861953 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/d2d8d1c1-ad28-45c6-8314-935e8c60b976-apiservice-cert\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.875165 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcmg\" (UniqueName: \"kubernetes.io/projected/d2d8d1c1-ad28-45c6-8314-935e8c60b976-kube-api-access-4vcmg\") pod \"metallb-operator-webhook-server-7756f55684-8twln\" (UID: \"d2d8d1c1-ad28-45c6-8314-935e8c60b976\") " pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:34 crc kubenswrapper[4694]: I0217 16:57:34.966531 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs"] Feb 17 16:57:35 crc kubenswrapper[4694]: I0217 16:57:35.006054 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:35 crc kubenswrapper[4694]: I0217 16:57:35.218476 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7756f55684-8twln"] Feb 17 16:57:35 crc kubenswrapper[4694]: I0217 16:57:35.222635 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" event={"ID":"152f177f-c542-4114-9d7b-601185b129b2","Type":"ContainerStarted","Data":"d236f7cccb6902a9b437eac72af18ef649b30d16765a639a6872ac3ada23040e"} Feb 17 16:57:35 crc kubenswrapper[4694]: W0217 16:57:35.226416 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2d8d1c1_ad28_45c6_8314_935e8c60b976.slice/crio-4218161c37ad861d48c12b8d1200e86f65af4dca3242660aad375cfe18fdca2a WatchSource:0}: Error finding container 4218161c37ad861d48c12b8d1200e86f65af4dca3242660aad375cfe18fdca2a: Status 404 returned error can't find the container with id 4218161c37ad861d48c12b8d1200e86f65af4dca3242660aad375cfe18fdca2a Feb 17 16:57:36 crc kubenswrapper[4694]: I0217 16:57:36.239018 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" event={"ID":"d2d8d1c1-ad28-45c6-8314-935e8c60b976","Type":"ContainerStarted","Data":"4218161c37ad861d48c12b8d1200e86f65af4dca3242660aad375cfe18fdca2a"} Feb 17 16:57:38 crc kubenswrapper[4694]: I0217 16:57:38.636411 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:38 crc kubenswrapper[4694]: I0217 16:57:38.636520 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:38 crc kubenswrapper[4694]: I0217 16:57:38.683271 4694 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:39 crc kubenswrapper[4694]: I0217 16:57:39.306948 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.273407 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" event={"ID":"152f177f-c542-4114-9d7b-601185b129b2","Type":"ContainerStarted","Data":"c8aff1e8027d1dcae2741aa72561310a992a12965feed7060d6d3728aab91a0d"} Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.275012 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" event={"ID":"d2d8d1c1-ad28-45c6-8314-935e8c60b976","Type":"ContainerStarted","Data":"68ae8c900d535eaf9648baac439c8a126bf679b392bf453dfc0f838f09fcb44b"} Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.275121 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.275221 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.304741 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bs2jq"] Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.321113 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" podStartSLOduration=1.8942975990000002 podStartE2EDuration="6.321094468s" podCreationTimestamp="2026-02-17 16:57:34 +0000 UTC" firstStartedPulling="2026-02-17 16:57:34.978337083 +0000 UTC m=+922.735412407" lastFinishedPulling="2026-02-17 
16:57:39.405133952 +0000 UTC m=+927.162209276" observedRunningTime="2026-02-17 16:57:40.320051202 +0000 UTC m=+928.077126536" watchObservedRunningTime="2026-02-17 16:57:40.321094468 +0000 UTC m=+928.078169812" Feb 17 16:57:40 crc kubenswrapper[4694]: I0217 16:57:40.337814 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" podStartSLOduration=2.143936456 podStartE2EDuration="6.337795227s" podCreationTimestamp="2026-02-17 16:57:34 +0000 UTC" firstStartedPulling="2026-02-17 16:57:35.229268002 +0000 UTC m=+922.986343336" lastFinishedPulling="2026-02-17 16:57:39.423126783 +0000 UTC m=+927.180202107" observedRunningTime="2026-02-17 16:57:40.336412503 +0000 UTC m=+928.093487857" watchObservedRunningTime="2026-02-17 16:57:40.337795227 +0000 UTC m=+928.094870561" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.285932 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bs2jq" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="registry-server" containerID="cri-o://aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf" gracePeriod=2 Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.707019 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.767214 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvczr\" (UniqueName: \"kubernetes.io/projected/bb1cca17-c4a5-497a-8362-fe1568581692-kube-api-access-fvczr\") pod \"bb1cca17-c4a5-497a-8362-fe1568581692\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.767269 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-catalog-content\") pod \"bb1cca17-c4a5-497a-8362-fe1568581692\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.767297 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-utilities\") pod \"bb1cca17-c4a5-497a-8362-fe1568581692\" (UID: \"bb1cca17-c4a5-497a-8362-fe1568581692\") " Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.768356 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-utilities" (OuterVolumeSpecName: "utilities") pod "bb1cca17-c4a5-497a-8362-fe1568581692" (UID: "bb1cca17-c4a5-497a-8362-fe1568581692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.772483 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1cca17-c4a5-497a-8362-fe1568581692-kube-api-access-fvczr" (OuterVolumeSpecName: "kube-api-access-fvczr") pod "bb1cca17-c4a5-497a-8362-fe1568581692" (UID: "bb1cca17-c4a5-497a-8362-fe1568581692"). InnerVolumeSpecName "kube-api-access-fvczr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.827914 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb1cca17-c4a5-497a-8362-fe1568581692" (UID: "bb1cca17-c4a5-497a-8362-fe1568581692"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.868205 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvczr\" (UniqueName: \"kubernetes.io/projected/bb1cca17-c4a5-497a-8362-fe1568581692-kube-api-access-fvczr\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.868237 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:42 crc kubenswrapper[4694]: I0217 16:57:42.868250 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1cca17-c4a5-497a-8362-fe1568581692-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.294064 4694 generic.go:334] "Generic (PLEG): container finished" podID="bb1cca17-c4a5-497a-8362-fe1568581692" containerID="aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf" exitCode=0 Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.294143 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bs2jq" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.294144 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bs2jq" event={"ID":"bb1cca17-c4a5-497a-8362-fe1568581692","Type":"ContainerDied","Data":"aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf"} Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.294235 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bs2jq" event={"ID":"bb1cca17-c4a5-497a-8362-fe1568581692","Type":"ContainerDied","Data":"b1ef6ecc1a1a03bfac8853f687893d51dafb7ab6fc209b67a0039ed41d88ee0c"} Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.294266 4694 scope.go:117] "RemoveContainer" containerID="aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.310917 4694 scope.go:117] "RemoveContainer" containerID="087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.312233 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bs2jq"] Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.322174 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bs2jq"] Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.327910 4694 scope.go:117] "RemoveContainer" containerID="6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.347887 4694 scope.go:117] "RemoveContainer" containerID="aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf" Feb 17 16:57:43 crc kubenswrapper[4694]: E0217 16:57:43.348765 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf\": container with ID starting with aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf not found: ID does not exist" containerID="aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.348808 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf"} err="failed to get container status \"aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf\": rpc error: code = NotFound desc = could not find container \"aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf\": container with ID starting with aad234980b3bb08d10d0051583df245b439e25a6587f06072dbcb1192fa049cf not found: ID does not exist" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.348839 4694 scope.go:117] "RemoveContainer" containerID="087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8" Feb 17 16:57:43 crc kubenswrapper[4694]: E0217 16:57:43.349211 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8\": container with ID starting with 087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8 not found: ID does not exist" containerID="087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.349257 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8"} err="failed to get container status \"087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8\": rpc error: code = NotFound desc = could not find container \"087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8\": container with ID 
starting with 087e1ab4d4eeeab42fa90fdf93c1030cd86f71dfd57b8e27346d4ac78c0b44c8 not found: ID does not exist" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.349292 4694 scope.go:117] "RemoveContainer" containerID="6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32" Feb 17 16:57:43 crc kubenswrapper[4694]: E0217 16:57:43.349745 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32\": container with ID starting with 6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32 not found: ID does not exist" containerID="6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32" Feb 17 16:57:43 crc kubenswrapper[4694]: I0217 16:57:43.349769 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32"} err="failed to get container status \"6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32\": rpc error: code = NotFound desc = could not find container \"6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32\": container with ID starting with 6aac52ca2f7d524e5e0351f16fd2bdb31862c6564b2472966d7f5bf96bb0ed32 not found: ID does not exist" Feb 17 16:57:44 crc kubenswrapper[4694]: I0217 16:57:44.618287 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:57:44 crc kubenswrapper[4694]: I0217 16:57:44.618960 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:57:44 crc kubenswrapper[4694]: I0217 16:57:44.905219 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" path="/var/lib/kubelet/pods/bb1cca17-c4a5-497a-8362-fe1568581692/volumes" Feb 17 16:57:55 crc kubenswrapper[4694]: I0217 16:57:55.013413 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7756f55684-8twln" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.216601 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fzqxh"] Feb 17 16:58:14 crc kubenswrapper[4694]: E0217 16:58:14.217461 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="extract-content" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.217475 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="extract-content" Feb 17 16:58:14 crc kubenswrapper[4694]: E0217 16:58:14.217486 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="registry-server" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.217494 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="registry-server" Feb 17 16:58:14 crc kubenswrapper[4694]: E0217 16:58:14.217509 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="extract-utilities" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.217516 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="extract-utilities" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.217663 4694 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1cca17-c4a5-497a-8362-fe1568581692" containerName="registry-server" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.218570 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.253268 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzqxh"] Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.268186 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-catalog-content\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.268349 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjbk4\" (UniqueName: \"kubernetes.io/projected/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-kube-api-access-jjbk4\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.268386 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-utilities\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.369834 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-catalog-content\") 
pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.369920 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjbk4\" (UniqueName: \"kubernetes.io/projected/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-kube-api-access-jjbk4\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.369961 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-utilities\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.370437 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-catalog-content\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.370449 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-utilities\") pod \"redhat-marketplace-fzqxh\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.388462 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjbk4\" (UniqueName: \"kubernetes.io/projected/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-kube-api-access-jjbk4\") pod \"redhat-marketplace-fzqxh\" (UID: 
\"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.531376 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.617908 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.618017 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.750990 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d47c4d78b-gffhs" Feb 17 16:58:14 crc kubenswrapper[4694]: I0217 16:58:14.757924 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzqxh"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.477195 4694 generic.go:334] "Generic (PLEG): container finished" podID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerID="14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a" exitCode=0 Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.477344 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzqxh" event={"ID":"d95c3f53-9a33-42a3-8413-2b4fbe620bc7","Type":"ContainerDied","Data":"14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a"} Feb 
17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.477557 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzqxh" event={"ID":"d95c3f53-9a33-42a3-8413-2b4fbe620bc7","Type":"ContainerStarted","Data":"844f88059f2c4d158a5bddd1cb9c31d553233ec0e9f8bf5177184a3e873252ac"} Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.511171 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cmzh5"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.521962 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.522307 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.523158 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.527140 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.527298 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.527407 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-c5pn7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.532113 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.539218 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588319 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-sockets\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588368 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-reloader\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588397 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1d3f052-b387-452c-a154-a4f7cd14a6b7-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-nzwnk\" (UID: \"b1d3f052-b387-452c-a154-a4f7cd14a6b7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588421 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-startup\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588449 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdf4\" (UniqueName: \"kubernetes.io/projected/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-kube-api-access-vgdf4\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588487 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588512 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmr6b\" (UniqueName: \"kubernetes.io/projected/b1d3f052-b387-452c-a154-a4f7cd14a6b7-kube-api-access-fmr6b\") pod \"frr-k8s-webhook-server-78b44bf5bb-nzwnk\" (UID: \"b1d3f052-b387-452c-a154-a4f7cd14a6b7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588533 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics-certs\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.588575 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-conf\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.604193 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-xlw27"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.605249 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.609839 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.610031 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.610189 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zdb42" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.613007 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.616851 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-nhxz7"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.617737 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.621908 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.630715 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-nhxz7"] Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689291 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-conf\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689336 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13962e8e-e444-4010-912e-9c953c8f7b8f-cert\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689358 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e542a8da-05af-4ddd-95dc-cf10576c4658-metallb-excludel2\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689394 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-reloader\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689408 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-sockets\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689428 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4g7s\" (UniqueName: \"kubernetes.io/projected/13962e8e-e444-4010-912e-9c953c8f7b8f-kube-api-access-w4g7s\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689446 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1d3f052-b387-452c-a154-a4f7cd14a6b7-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-nzwnk\" (UID: \"b1d3f052-b387-452c-a154-a4f7cd14a6b7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689465 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-startup\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689492 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdf4\" (UniqueName: \"kubernetes.io/projected/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-kube-api-access-vgdf4\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689534 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-metrics-certs\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689551 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwpc\" (UniqueName: \"kubernetes.io/projected/e542a8da-05af-4ddd-95dc-cf10576c4658-kube-api-access-snwpc\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689573 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689623 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmr6b\" (UniqueName: \"kubernetes.io/projected/b1d3f052-b387-452c-a154-a4f7cd14a6b7-kube-api-access-fmr6b\") pod \"frr-k8s-webhook-server-78b44bf5bb-nzwnk\" (UID: \"b1d3f052-b387-452c-a154-a4f7cd14a6b7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689642 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13962e8e-e444-4010-912e-9c953c8f7b8f-metrics-certs\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689657 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics-certs\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.689672 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.690066 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-conf\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.690246 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-reloader\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.690413 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-sockets\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: E0217 16:58:15.691449 4694 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 17 16:58:15 crc kubenswrapper[4694]: E0217 16:58:15.691505 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics-certs 
podName:e61de6d5-a385-4bbb-8ddf-b4af95e92b3c nodeName:}" failed. No retries permitted until 2026-02-17 16:58:16.191488909 +0000 UTC m=+963.948564233 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics-certs") pod "frr-k8s-cmzh5" (UID: "e61de6d5-a385-4bbb-8ddf-b4af95e92b3c") : secret "frr-k8s-certs-secret" not found Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.692023 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.693861 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-frr-startup\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.696232 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1d3f052-b387-452c-a154-a4f7cd14a6b7-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-nzwnk\" (UID: \"b1d3f052-b387-452c-a154-a4f7cd14a6b7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.709576 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdf4\" (UniqueName: \"kubernetes.io/projected/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-kube-api-access-vgdf4\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.714785 4694 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fmr6b\" (UniqueName: \"kubernetes.io/projected/b1d3f052-b387-452c-a154-a4f7cd14a6b7-kube-api-access-fmr6b\") pod \"frr-k8s-webhook-server-78b44bf5bb-nzwnk\" (UID: \"b1d3f052-b387-452c-a154-a4f7cd14a6b7\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.790909 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4g7s\" (UniqueName: \"kubernetes.io/projected/13962e8e-e444-4010-912e-9c953c8f7b8f-kube-api-access-w4g7s\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.791276 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-metrics-certs\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.791306 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwpc\" (UniqueName: \"kubernetes.io/projected/e542a8da-05af-4ddd-95dc-cf10576c4658-kube-api-access-snwpc\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.791339 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13962e8e-e444-4010-912e-9c953c8f7b8f-metrics-certs\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.791367 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" 
(UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: E0217 16:58:15.791512 4694 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 16:58:15 crc kubenswrapper[4694]: E0217 16:58:15.791569 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist podName:e542a8da-05af-4ddd-95dc-cf10576c4658 nodeName:}" failed. No retries permitted until 2026-02-17 16:58:16.291552281 +0000 UTC m=+964.048627605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist") pod "speaker-xlw27" (UID: "e542a8da-05af-4ddd-95dc-cf10576c4658") : secret "metallb-memberlist" not found Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.791696 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13962e8e-e444-4010-912e-9c953c8f7b8f-cert\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.791723 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e542a8da-05af-4ddd-95dc-cf10576c4658-metallb-excludel2\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.792467 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e542a8da-05af-4ddd-95dc-cf10576c4658-metallb-excludel2\") pod 
\"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.793793 4694 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.794503 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-metrics-certs\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.794678 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/13962e8e-e444-4010-912e-9c953c8f7b8f-metrics-certs\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.804784 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13962e8e-e444-4010-912e-9c953c8f7b8f-cert\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.811660 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4g7s\" (UniqueName: \"kubernetes.io/projected/13962e8e-e444-4010-912e-9c953c8f7b8f-kube-api-access-w4g7s\") pod \"controller-69bbfbf88f-nhxz7\" (UID: \"13962e8e-e444-4010-912e-9c953c8f7b8f\") " pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.814941 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwpc\" (UniqueName: 
\"kubernetes.io/projected/e542a8da-05af-4ddd-95dc-cf10576c4658-kube-api-access-snwpc\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.862559 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:15 crc kubenswrapper[4694]: I0217 16:58:15.939146 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.048378 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk"] Feb 17 16:58:16 crc kubenswrapper[4694]: W0217 16:58:16.068219 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d3f052_b387_452c_a154_a4f7cd14a6b7.slice/crio-02fa214f99f8216402a27d7356965b9c2326398df026a999c26074adbe5fd484 WatchSource:0}: Error finding container 02fa214f99f8216402a27d7356965b9c2326398df026a999c26074adbe5fd484: Status 404 returned error can't find the container with id 02fa214f99f8216402a27d7356965b9c2326398df026a999c26074adbe5fd484 Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.141513 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-nhxz7"] Feb 17 16:58:16 crc kubenswrapper[4694]: W0217 16:58:16.146718 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13962e8e_e444_4010_912e_9c953c8f7b8f.slice/crio-0cdc0d32cbf2048423af5b23a03ab142dd01eaf19fbaa90a9ff4413b493ee465 WatchSource:0}: Error finding container 0cdc0d32cbf2048423af5b23a03ab142dd01eaf19fbaa90a9ff4413b493ee465: Status 404 returned error can't find the container with id 
0cdc0d32cbf2048423af5b23a03ab142dd01eaf19fbaa90a9ff4413b493ee465 Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.198742 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics-certs\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.203422 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e61de6d5-a385-4bbb-8ddf-b4af95e92b3c-metrics-certs\") pod \"frr-k8s-cmzh5\" (UID: \"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c\") " pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.300226 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:16 crc kubenswrapper[4694]: E0217 16:58:16.300400 4694 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 16:58:16 crc kubenswrapper[4694]: E0217 16:58:16.300509 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist podName:e542a8da-05af-4ddd-95dc-cf10576c4658 nodeName:}" failed. No retries permitted until 2026-02-17 16:58:17.300481291 +0000 UTC m=+965.057556635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist") pod "speaker-xlw27" (UID: "e542a8da-05af-4ddd-95dc-cf10576c4658") : secret "metallb-memberlist" not found Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.448927 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.485764 4694 generic.go:334] "Generic (PLEG): container finished" podID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerID="e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2" exitCode=0 Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.485826 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzqxh" event={"ID":"d95c3f53-9a33-42a3-8413-2b4fbe620bc7","Type":"ContainerDied","Data":"e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2"} Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.493287 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-nhxz7" event={"ID":"13962e8e-e444-4010-912e-9c953c8f7b8f","Type":"ContainerStarted","Data":"46fa3dedf079270086de93d38ce60635d2667f8b7011be86da7fa57ed2003b59"} Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.493339 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-nhxz7" event={"ID":"13962e8e-e444-4010-912e-9c953c8f7b8f","Type":"ContainerStarted","Data":"799e48903a33c98f9a84d5d2ca7f66ff429f6bb227098a5911cf4fe8281e8977"} Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.493353 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-nhxz7" event={"ID":"13962e8e-e444-4010-912e-9c953c8f7b8f","Type":"ContainerStarted","Data":"0cdc0d32cbf2048423af5b23a03ab142dd01eaf19fbaa90a9ff4413b493ee465"} Feb 17 16:58:16 crc kubenswrapper[4694]: 
I0217 16:58:16.493373 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.494099 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" event={"ID":"b1d3f052-b387-452c-a154-a4f7cd14a6b7","Type":"ContainerStarted","Data":"02fa214f99f8216402a27d7356965b9c2326398df026a999c26074adbe5fd484"} Feb 17 16:58:16 crc kubenswrapper[4694]: I0217 16:58:16.538193 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-nhxz7" podStartSLOduration=1.5381741660000001 podStartE2EDuration="1.538174166s" podCreationTimestamp="2026-02-17 16:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:58:16.535247864 +0000 UTC m=+964.292323198" watchObservedRunningTime="2026-02-17 16:58:16.538174166 +0000 UTC m=+964.295249490" Feb 17 16:58:17 crc kubenswrapper[4694]: I0217 16:58:17.312170 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:17 crc kubenswrapper[4694]: I0217 16:58:17.324256 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e542a8da-05af-4ddd-95dc-cf10576c4658-memberlist\") pod \"speaker-xlw27\" (UID: \"e542a8da-05af-4ddd-95dc-cf10576c4658\") " pod="metallb-system/speaker-xlw27" Feb 17 16:58:17 crc kubenswrapper[4694]: I0217 16:58:17.429368 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-xlw27" Feb 17 16:58:17 crc kubenswrapper[4694]: W0217 16:58:17.459944 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode542a8da_05af_4ddd_95dc_cf10576c4658.slice/crio-f4ecbbb5ae61daa7003d9eab724513e5cb0d19b7653bb14d24c8e8f310a801fd WatchSource:0}: Error finding container f4ecbbb5ae61daa7003d9eab724513e5cb0d19b7653bb14d24c8e8f310a801fd: Status 404 returned error can't find the container with id f4ecbbb5ae61daa7003d9eab724513e5cb0d19b7653bb14d24c8e8f310a801fd Feb 17 16:58:17 crc kubenswrapper[4694]: I0217 16:58:17.501662 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xlw27" event={"ID":"e542a8da-05af-4ddd-95dc-cf10576c4658","Type":"ContainerStarted","Data":"f4ecbbb5ae61daa7003d9eab724513e5cb0d19b7653bb14d24c8e8f310a801fd"} Feb 17 16:58:17 crc kubenswrapper[4694]: I0217 16:58:17.504122 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzqxh" event={"ID":"d95c3f53-9a33-42a3-8413-2b4fbe620bc7","Type":"ContainerStarted","Data":"0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14"} Feb 17 16:58:17 crc kubenswrapper[4694]: I0217 16:58:17.506723 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"f29b19b94e86c2d3a4e78e50eb51b4e456300de1f2a62adaf2ee03ffacc1cb42"} Feb 17 16:58:18 crc kubenswrapper[4694]: I0217 16:58:18.515313 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xlw27" event={"ID":"e542a8da-05af-4ddd-95dc-cf10576c4658","Type":"ContainerStarted","Data":"d08a6bd9cfe0a730942b13da43120391dc1684bab3d94e3b719d658a1b1c777f"} Feb 17 16:58:18 crc kubenswrapper[4694]: I0217 16:58:18.515671 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xlw27" 
event={"ID":"e542a8da-05af-4ddd-95dc-cf10576c4658","Type":"ContainerStarted","Data":"8264f03de64a1242a25d6aca2cef563139138b51ec87bd7ba4c7001a0a9d978b"} Feb 17 16:58:18 crc kubenswrapper[4694]: I0217 16:58:18.535259 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fzqxh" podStartSLOduration=3.12308401 podStartE2EDuration="4.535237055s" podCreationTimestamp="2026-02-17 16:58:14 +0000 UTC" firstStartedPulling="2026-02-17 16:58:15.478667813 +0000 UTC m=+963.235743137" lastFinishedPulling="2026-02-17 16:58:16.890820848 +0000 UTC m=+964.647896182" observedRunningTime="2026-02-17 16:58:17.533402265 +0000 UTC m=+965.290477589" watchObservedRunningTime="2026-02-17 16:58:18.535237055 +0000 UTC m=+966.292312379" Feb 17 16:58:18 crc kubenswrapper[4694]: I0217 16:58:18.537154 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-xlw27" podStartSLOduration=3.537147302 podStartE2EDuration="3.537147302s" podCreationTimestamp="2026-02-17 16:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 16:58:18.531011612 +0000 UTC m=+966.288086926" watchObservedRunningTime="2026-02-17 16:58:18.537147302 +0000 UTC m=+966.294222636" Feb 17 16:58:19 crc kubenswrapper[4694]: I0217 16:58:19.521990 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-xlw27" Feb 17 16:58:23 crc kubenswrapper[4694]: I0217 16:58:23.561517 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" event={"ID":"b1d3f052-b387-452c-a154-a4f7cd14a6b7","Type":"ContainerStarted","Data":"52174290066e4d5f5825789f8be4bbebc5ec65bff5cd53a64b4aa720235cff84"} Feb 17 16:58:23 crc kubenswrapper[4694]: I0217 16:58:23.562085 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:23 crc kubenswrapper[4694]: I0217 16:58:23.564728 4694 generic.go:334] "Generic (PLEG): container finished" podID="e61de6d5-a385-4bbb-8ddf-b4af95e92b3c" containerID="3e44706b153d2727ebccdb5c572743819c700507b9c2478663779f34e0c530cd" exitCode=0 Feb 17 16:58:23 crc kubenswrapper[4694]: I0217 16:58:23.564765 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerDied","Data":"3e44706b153d2727ebccdb5c572743819c700507b9c2478663779f34e0c530cd"} Feb 17 16:58:23 crc kubenswrapper[4694]: I0217 16:58:23.585054 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" podStartSLOduration=1.575731046 podStartE2EDuration="8.585037171s" podCreationTimestamp="2026-02-17 16:58:15 +0000 UTC" firstStartedPulling="2026-02-17 16:58:16.07431021 +0000 UTC m=+963.831385534" lastFinishedPulling="2026-02-17 16:58:23.083616335 +0000 UTC m=+970.840691659" observedRunningTime="2026-02-17 16:58:23.584955389 +0000 UTC m=+971.342030723" watchObservedRunningTime="2026-02-17 16:58:23.585037171 +0000 UTC m=+971.342112495" Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.532837 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.532893 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.581255 4694 generic.go:334] "Generic (PLEG): container finished" podID="e61de6d5-a385-4bbb-8ddf-b4af95e92b3c" containerID="b242ca733e18ac8e7dcf7b82d914a517d939338de9a751ed3232ccb5c1cdeadb" exitCode=0 Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.581298 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerDied","Data":"b242ca733e18ac8e7dcf7b82d914a517d939338de9a751ed3232ccb5c1cdeadb"} Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.649851 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.705747 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:24 crc kubenswrapper[4694]: E0217 16:58:24.832765 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61de6d5_a385_4bbb_8ddf_b4af95e92b3c.slice/crio-conmon-97024317b614e477bb698bf6fd73d0924fb05e736dce199054deeefec47ab498.scope\": RecentStats: unable to find data in memory cache]" Feb 17 16:58:24 crc kubenswrapper[4694]: I0217 16:58:24.906143 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzqxh"] Feb 17 16:58:25 crc kubenswrapper[4694]: I0217 16:58:25.591016 4694 generic.go:334] "Generic (PLEG): container finished" podID="e61de6d5-a385-4bbb-8ddf-b4af95e92b3c" containerID="97024317b614e477bb698bf6fd73d0924fb05e736dce199054deeefec47ab498" exitCode=0 Feb 17 16:58:25 crc kubenswrapper[4694]: I0217 16:58:25.591131 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerDied","Data":"97024317b614e477bb698bf6fd73d0924fb05e736dce199054deeefec47ab498"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.620589 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fzqxh" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="registry-server" 
containerID="cri-o://0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14" gracePeriod=2 Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621205 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"f23a362cd336539682c3f99ca721328d0e89bf65bbc70c196f839505a9c7bf4b"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621235 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"dfb7347f91518c2f00291f0ca6f40657b5322131facca17beec5502c359047ec"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621245 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"6479eb285ee31a3e80bb1eb5170f8a977a72cfc3c528d9b6e887f065fe561b9c"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621254 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"9b9db9478d160f384017a1222fce613aafb97c7fe0e29740a9d1927b08aa980b"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621275 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"046ade81be0a500bccb264abbd48162cdcc799887b602a66eab6b50f46467248"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621284 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cmzh5" event={"ID":"e61de6d5-a385-4bbb-8ddf-b4af95e92b3c","Type":"ContainerStarted","Data":"bec4f92634557a05a372129a354dab7f66ed1480e1c3796d740d1293ed2d6dbd"} Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.621383 4694 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:26 crc kubenswrapper[4694]: I0217 16:58:26.655325 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cmzh5" podStartSLOduration=5.175776534 podStartE2EDuration="11.655310127s" podCreationTimestamp="2026-02-17 16:58:15 +0000 UTC" firstStartedPulling="2026-02-17 16:58:16.629688709 +0000 UTC m=+964.386764033" lastFinishedPulling="2026-02-17 16:58:23.109222302 +0000 UTC m=+970.866297626" observedRunningTime="2026-02-17 16:58:26.653229946 +0000 UTC m=+974.410305290" watchObservedRunningTime="2026-02-17 16:58:26.655310127 +0000 UTC m=+974.412385451" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.237130 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.365057 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-utilities\") pod \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.365209 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjbk4\" (UniqueName: \"kubernetes.io/projected/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-kube-api-access-jjbk4\") pod \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.365240 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-catalog-content\") pod \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\" (UID: \"d95c3f53-9a33-42a3-8413-2b4fbe620bc7\") " Feb 17 16:58:27 crc 
kubenswrapper[4694]: I0217 16:58:27.366295 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-utilities" (OuterVolumeSpecName: "utilities") pod "d95c3f53-9a33-42a3-8413-2b4fbe620bc7" (UID: "d95c3f53-9a33-42a3-8413-2b4fbe620bc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.370905 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-kube-api-access-jjbk4" (OuterVolumeSpecName: "kube-api-access-jjbk4") pod "d95c3f53-9a33-42a3-8413-2b4fbe620bc7" (UID: "d95c3f53-9a33-42a3-8413-2b4fbe620bc7"). InnerVolumeSpecName "kube-api-access-jjbk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.391096 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d95c3f53-9a33-42a3-8413-2b4fbe620bc7" (UID: "d95c3f53-9a33-42a3-8413-2b4fbe620bc7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.433975 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-xlw27" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.467400 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.467672 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.467761 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjbk4\" (UniqueName: \"kubernetes.io/projected/d95c3f53-9a33-42a3-8413-2b4fbe620bc7-kube-api-access-jjbk4\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.629004 4694 generic.go:334] "Generic (PLEG): container finished" podID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerID="0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14" exitCode=0 Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.629075 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzqxh" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.629096 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzqxh" event={"ID":"d95c3f53-9a33-42a3-8413-2b4fbe620bc7","Type":"ContainerDied","Data":"0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14"} Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.629141 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzqxh" event={"ID":"d95c3f53-9a33-42a3-8413-2b4fbe620bc7","Type":"ContainerDied","Data":"844f88059f2c4d158a5bddd1cb9c31d553233ec0e9f8bf5177184a3e873252ac"} Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.629161 4694 scope.go:117] "RemoveContainer" containerID="0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.645537 4694 scope.go:117] "RemoveContainer" containerID="e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.668138 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzqxh"] Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.669691 4694 scope.go:117] "RemoveContainer" containerID="14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.673194 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzqxh"] Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.682737 4694 scope.go:117] "RemoveContainer" containerID="0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14" Feb 17 16:58:27 crc kubenswrapper[4694]: E0217 16:58:27.683198 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14\": container with ID starting with 0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14 not found: ID does not exist" containerID="0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.683305 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14"} err="failed to get container status \"0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14\": rpc error: code = NotFound desc = could not find container \"0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14\": container with ID starting with 0f6f519b1e95619ac7c7946ada643010668ebe51e47e77ca5784c08c35d66a14 not found: ID does not exist" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.683387 4694 scope.go:117] "RemoveContainer" containerID="e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2" Feb 17 16:58:27 crc kubenswrapper[4694]: E0217 16:58:27.683800 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2\": container with ID starting with e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2 not found: ID does not exist" containerID="e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.683837 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2"} err="failed to get container status \"e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2\": rpc error: code = NotFound desc = could not find container \"e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2\": container with ID 
starting with e4ff54e5e81881975a1aaba60ed5869e1e868ecbb7794e66316bbace2f6bc9b2 not found: ID does not exist" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.683863 4694 scope.go:117] "RemoveContainer" containerID="14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a" Feb 17 16:58:27 crc kubenswrapper[4694]: E0217 16:58:27.684172 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a\": container with ID starting with 14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a not found: ID does not exist" containerID="14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a" Feb 17 16:58:27 crc kubenswrapper[4694]: I0217 16:58:27.684201 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a"} err="failed to get container status \"14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a\": rpc error: code = NotFound desc = could not find container \"14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a\": container with ID starting with 14650462db85755e8cf3cc3d702a8097c1a31e46d88d821243c9483ce8b1e63a not found: ID does not exist" Feb 17 16:58:28 crc kubenswrapper[4694]: I0217 16:58:28.902975 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" path="/var/lib/kubelet/pods/d95c3f53-9a33-42a3-8413-2b4fbe620bc7/volumes" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.278750 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ggwdp"] Feb 17 16:58:30 crc kubenswrapper[4694]: E0217 16:58:30.279335 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="extract-utilities" Feb 17 16:58:30 crc 
kubenswrapper[4694]: I0217 16:58:30.279351 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="extract-utilities" Feb 17 16:58:30 crc kubenswrapper[4694]: E0217 16:58:30.279375 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="extract-content" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.279384 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="extract-content" Feb 17 16:58:30 crc kubenswrapper[4694]: E0217 16:58:30.279414 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="registry-server" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.279424 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="registry-server" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.279739 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95c3f53-9a33-42a3-8413-2b4fbe620bc7" containerName="registry-server" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.280544 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.286705 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ggwdp"] Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.304583 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2dg5b" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.304866 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.305046 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.406561 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzxbs\" (UniqueName: \"kubernetes.io/projected/58f81a9e-ea64-4230-9948-48b4bec276d7-kube-api-access-bzxbs\") pod \"openstack-operator-index-ggwdp\" (UID: \"58f81a9e-ea64-4230-9948-48b4bec276d7\") " pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.508447 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzxbs\" (UniqueName: \"kubernetes.io/projected/58f81a9e-ea64-4230-9948-48b4bec276d7-kube-api-access-bzxbs\") pod \"openstack-operator-index-ggwdp\" (UID: \"58f81a9e-ea64-4230-9948-48b4bec276d7\") " pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.531586 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzxbs\" (UniqueName: \"kubernetes.io/projected/58f81a9e-ea64-4230-9948-48b4bec276d7-kube-api-access-bzxbs\") pod \"openstack-operator-index-ggwdp\" (UID: 
\"58f81a9e-ea64-4230-9948-48b4bec276d7\") " pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:30 crc kubenswrapper[4694]: I0217 16:58:30.644167 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:31 crc kubenswrapper[4694]: I0217 16:58:31.026375 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ggwdp"] Feb 17 16:58:31 crc kubenswrapper[4694]: W0217 16:58:31.033680 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58f81a9e_ea64_4230_9948_48b4bec276d7.slice/crio-a416668fbca461f5d7539b087fb1938fbfd6f8a81b20e1203308e01c2f34ac48 WatchSource:0}: Error finding container a416668fbca461f5d7539b087fb1938fbfd6f8a81b20e1203308e01c2f34ac48: Status 404 returned error can't find the container with id a416668fbca461f5d7539b087fb1938fbfd6f8a81b20e1203308e01c2f34ac48 Feb 17 16:58:31 crc kubenswrapper[4694]: I0217 16:58:31.450146 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:31 crc kubenswrapper[4694]: I0217 16:58:31.499174 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:31 crc kubenswrapper[4694]: I0217 16:58:31.657235 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ggwdp" event={"ID":"58f81a9e-ea64-4230-9948-48b4bec276d7","Type":"ContainerStarted","Data":"a416668fbca461f5d7539b087fb1938fbfd6f8a81b20e1203308e01c2f34ac48"} Feb 17 16:58:33 crc kubenswrapper[4694]: I0217 16:58:33.674313 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ggwdp" 
event={"ID":"58f81a9e-ea64-4230-9948-48b4bec276d7","Type":"ContainerStarted","Data":"2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378"} Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.104549 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ggwdp" podStartSLOduration=2.22224887 podStartE2EDuration="4.104531495s" podCreationTimestamp="2026-02-17 16:58:30 +0000 UTC" firstStartedPulling="2026-02-17 16:58:31.038932391 +0000 UTC m=+978.796007715" lastFinishedPulling="2026-02-17 16:58:32.921214986 +0000 UTC m=+980.678290340" observedRunningTime="2026-02-17 16:58:33.69801476 +0000 UTC m=+981.455090084" watchObservedRunningTime="2026-02-17 16:58:34.104531495 +0000 UTC m=+981.861606819" Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.105181 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ggwdp"] Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.720915 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hlsbt"] Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.723231 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.732488 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hlsbt"] Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.765675 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w6kl\" (UniqueName: \"kubernetes.io/projected/6022b0aa-9d87-47ae-8e99-4f71ef252803-kube-api-access-7w6kl\") pod \"openstack-operator-index-hlsbt\" (UID: \"6022b0aa-9d87-47ae-8e99-4f71ef252803\") " pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.866765 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w6kl\" (UniqueName: \"kubernetes.io/projected/6022b0aa-9d87-47ae-8e99-4f71ef252803-kube-api-access-7w6kl\") pod \"openstack-operator-index-hlsbt\" (UID: \"6022b0aa-9d87-47ae-8e99-4f71ef252803\") " pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:34 crc kubenswrapper[4694]: I0217 16:58:34.902875 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w6kl\" (UniqueName: \"kubernetes.io/projected/6022b0aa-9d87-47ae-8e99-4f71ef252803-kube-api-access-7w6kl\") pod \"openstack-operator-index-hlsbt\" (UID: \"6022b0aa-9d87-47ae-8e99-4f71ef252803\") " pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.050053 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.299965 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hlsbt"] Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.689585 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hlsbt" event={"ID":"6022b0aa-9d87-47ae-8e99-4f71ef252803","Type":"ContainerStarted","Data":"99716180b0d7efd48df208144d83924d38737abc058ab75dc98fd7227e443246"} Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.689992 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hlsbt" event={"ID":"6022b0aa-9d87-47ae-8e99-4f71ef252803","Type":"ContainerStarted","Data":"c7abcef167799a7284055764b077920f0e1fa791cdb195e7855fab770cdd51ca"} Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.689697 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ggwdp" podUID="58f81a9e-ea64-4230-9948-48b4bec276d7" containerName="registry-server" containerID="cri-o://2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378" gracePeriod=2 Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.871127 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-nzwnk" Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.888622 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hlsbt" podStartSLOduration=1.834857704 podStartE2EDuration="1.88855685s" podCreationTimestamp="2026-02-17 16:58:34 +0000 UTC" firstStartedPulling="2026-02-17 16:58:35.309144716 +0000 UTC m=+983.066220040" lastFinishedPulling="2026-02-17 16:58:35.362843872 +0000 UTC m=+983.119919186" observedRunningTime="2026-02-17 
16:58:35.70826643 +0000 UTC m=+983.465341754" watchObservedRunningTime="2026-02-17 16:58:35.88855685 +0000 UTC m=+983.645632184" Feb 17 16:58:35 crc kubenswrapper[4694]: I0217 16:58:35.945037 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-nhxz7" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.089314 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.184994 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzxbs\" (UniqueName: \"kubernetes.io/projected/58f81a9e-ea64-4230-9948-48b4bec276d7-kube-api-access-bzxbs\") pod \"58f81a9e-ea64-4230-9948-48b4bec276d7\" (UID: \"58f81a9e-ea64-4230-9948-48b4bec276d7\") " Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.194243 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f81a9e-ea64-4230-9948-48b4bec276d7-kube-api-access-bzxbs" (OuterVolumeSpecName: "kube-api-access-bzxbs") pod "58f81a9e-ea64-4230-9948-48b4bec276d7" (UID: "58f81a9e-ea64-4230-9948-48b4bec276d7"). InnerVolumeSpecName "kube-api-access-bzxbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.286641 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzxbs\" (UniqueName: \"kubernetes.io/projected/58f81a9e-ea64-4230-9948-48b4bec276d7-kube-api-access-bzxbs\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.453177 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cmzh5" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.697930 4694 generic.go:334] "Generic (PLEG): container finished" podID="58f81a9e-ea64-4230-9948-48b4bec276d7" containerID="2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378" exitCode=0 Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.698505 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ggwdp" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.698736 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ggwdp" event={"ID":"58f81a9e-ea64-4230-9948-48b4bec276d7","Type":"ContainerDied","Data":"2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378"} Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.698839 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ggwdp" event={"ID":"58f81a9e-ea64-4230-9948-48b4bec276d7","Type":"ContainerDied","Data":"a416668fbca461f5d7539b087fb1938fbfd6f8a81b20e1203308e01c2f34ac48"} Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.698892 4694 scope.go:117] "RemoveContainer" containerID="2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.718336 4694 scope.go:117] "RemoveContainer" containerID="2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378" Feb 17 16:58:36 crc 
kubenswrapper[4694]: E0217 16:58:36.719042 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378\": container with ID starting with 2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378 not found: ID does not exist" containerID="2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.719107 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378"} err="failed to get container status \"2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378\": rpc error: code = NotFound desc = could not find container \"2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378\": container with ID starting with 2d771bc134f4314b4dd4c16a54d1a172ad76a19d154f6e6751bee9cdea7ef378 not found: ID does not exist" Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.728033 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ggwdp"] Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.743433 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ggwdp"] Feb 17 16:58:36 crc kubenswrapper[4694]: I0217 16:58:36.909159 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f81a9e-ea64-4230-9948-48b4bec276d7" path="/var/lib/kubelet/pods/58f81a9e-ea64-4230-9948-48b4bec276d7/volumes" Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.618505 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 
17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.619188 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.619241 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.619741 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4749332bdc4a5e5d10099fdef7b4d20f81424c0b600631d13aa0f1be1b09107"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.619803 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://d4749332bdc4a5e5d10099fdef7b4d20f81424c0b600631d13aa0f1be1b09107" gracePeriod=600 Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.757330 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="d4749332bdc4a5e5d10099fdef7b4d20f81424c0b600631d13aa0f1be1b09107" exitCode=0 Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.757408 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"d4749332bdc4a5e5d10099fdef7b4d20f81424c0b600631d13aa0f1be1b09107"} Feb 17 16:58:44 crc kubenswrapper[4694]: I0217 16:58:44.758032 4694 scope.go:117] "RemoveContainer" containerID="47802caf4da2d01def887b2a300cd0debb1b3b1e63a218a62fa742a467a1bdb3" Feb 17 16:58:45 crc kubenswrapper[4694]: I0217 16:58:45.051170 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:45 crc kubenswrapper[4694]: I0217 16:58:45.051436 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:45 crc kubenswrapper[4694]: I0217 16:58:45.073661 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:45 crc kubenswrapper[4694]: I0217 16:58:45.767715 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"5aa651e570a8961f4584e9fe11d3f397047e9a6daf1e15f72d714be968799658"} Feb 17 16:58:45 crc kubenswrapper[4694]: I0217 16:58:45.796112 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hlsbt" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.648897 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf"] Feb 17 16:58:52 crc kubenswrapper[4694]: E0217 16:58:52.649898 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f81a9e-ea64-4230-9948-48b4bec276d7" containerName="registry-server" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.649916 4694 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="58f81a9e-ea64-4230-9948-48b4bec276d7" containerName="registry-server" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.650053 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f81a9e-ea64-4230-9948-48b4bec276d7" containerName="registry-server" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.650955 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.656247 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zvzp7" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.658429 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf"] Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.719873 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttq8w\" (UniqueName: \"kubernetes.io/projected/1ee81c0e-d67b-4713-bdb1-62d8092358ec-kube-api-access-ttq8w\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.719942 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-bundle\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.719983 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-util\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.821116 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttq8w\" (UniqueName: \"kubernetes.io/projected/1ee81c0e-d67b-4713-bdb1-62d8092358ec-kube-api-access-ttq8w\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.821178 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-bundle\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.821215 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-util\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.821722 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-bundle\") pod 
\"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.821764 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-util\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:52 crc kubenswrapper[4694]: I0217 16:58:52.842640 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttq8w\" (UniqueName: \"kubernetes.io/projected/1ee81c0e-d67b-4713-bdb1-62d8092358ec-kube-api-access-ttq8w\") pod \"c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:53 crc kubenswrapper[4694]: I0217 16:58:53.003231 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:53 crc kubenswrapper[4694]: I0217 16:58:53.465539 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf"] Feb 17 16:58:53 crc kubenswrapper[4694]: I0217 16:58:53.819245 4694 generic.go:334] "Generic (PLEG): container finished" podID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerID="ffaa0fad6cf7f16644fd8607a6cffe3365ca9e25279af809a1625558d29ab8dc" exitCode=0 Feb 17 16:58:53 crc kubenswrapper[4694]: I0217 16:58:53.819301 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" event={"ID":"1ee81c0e-d67b-4713-bdb1-62d8092358ec","Type":"ContainerDied","Data":"ffaa0fad6cf7f16644fd8607a6cffe3365ca9e25279af809a1625558d29ab8dc"} Feb 17 16:58:53 crc kubenswrapper[4694]: I0217 16:58:53.819549 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" event={"ID":"1ee81c0e-d67b-4713-bdb1-62d8092358ec","Type":"ContainerStarted","Data":"1965ec5ffe8d460bb160f3661f17bc9627721127eb67b73814860191cb713240"} Feb 17 16:58:54 crc kubenswrapper[4694]: I0217 16:58:54.828581 4694 generic.go:334] "Generic (PLEG): container finished" podID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerID="4f9859b25d2fd46b7c412264715c80bfc289852c22043c839518dbdfd394646a" exitCode=0 Feb 17 16:58:54 crc kubenswrapper[4694]: I0217 16:58:54.828643 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" event={"ID":"1ee81c0e-d67b-4713-bdb1-62d8092358ec","Type":"ContainerDied","Data":"4f9859b25d2fd46b7c412264715c80bfc289852c22043c839518dbdfd394646a"} Feb 17 16:58:55 crc kubenswrapper[4694]: I0217 16:58:55.838841 4694 generic.go:334] 
"Generic (PLEG): container finished" podID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerID="e56e4f9d56a07e8b69bb0c20aa31d2ca22c3641cbb1a321a9464b3fdbc0433ce" exitCode=0 Feb 17 16:58:55 crc kubenswrapper[4694]: I0217 16:58:55.838965 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" event={"ID":"1ee81c0e-d67b-4713-bdb1-62d8092358ec","Type":"ContainerDied","Data":"e56e4f9d56a07e8b69bb0c20aa31d2ca22c3641cbb1a321a9464b3fdbc0433ce"} Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.179661 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.279428 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-bundle\") pod \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.279487 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttq8w\" (UniqueName: \"kubernetes.io/projected/1ee81c0e-d67b-4713-bdb1-62d8092358ec-kube-api-access-ttq8w\") pod \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.279572 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-util\") pod \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\" (UID: \"1ee81c0e-d67b-4713-bdb1-62d8092358ec\") " Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.280285 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-bundle" (OuterVolumeSpecName: "bundle") pod "1ee81c0e-d67b-4713-bdb1-62d8092358ec" (UID: "1ee81c0e-d67b-4713-bdb1-62d8092358ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.287539 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee81c0e-d67b-4713-bdb1-62d8092358ec-kube-api-access-ttq8w" (OuterVolumeSpecName: "kube-api-access-ttq8w") pod "1ee81c0e-d67b-4713-bdb1-62d8092358ec" (UID: "1ee81c0e-d67b-4713-bdb1-62d8092358ec"). InnerVolumeSpecName "kube-api-access-ttq8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.293263 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-util" (OuterVolumeSpecName: "util") pod "1ee81c0e-d67b-4713-bdb1-62d8092358ec" (UID: "1ee81c0e-d67b-4713-bdb1-62d8092358ec"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.381506 4694 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-util\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.381534 4694 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ee81c0e-d67b-4713-bdb1-62d8092358ec-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.381547 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttq8w\" (UniqueName: \"kubernetes.io/projected/1ee81c0e-d67b-4713-bdb1-62d8092358ec-kube-api-access-ttq8w\") on node \"crc\" DevicePath \"\"" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.856088 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" event={"ID":"1ee81c0e-d67b-4713-bdb1-62d8092358ec","Type":"ContainerDied","Data":"1965ec5ffe8d460bb160f3661f17bc9627721127eb67b73814860191cb713240"} Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.856119 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1965ec5ffe8d460bb160f3661f17bc9627721127eb67b73814860191cb713240" Feb 17 16:58:57 crc kubenswrapper[4694]: I0217 16:58:57.856219 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.660998 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj"] Feb 17 16:59:04 crc kubenswrapper[4694]: E0217 16:59:04.661705 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="pull" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.661719 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="pull" Feb 17 16:59:04 crc kubenswrapper[4694]: E0217 16:59:04.661749 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="util" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.661757 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="util" Feb 17 16:59:04 crc kubenswrapper[4694]: E0217 16:59:04.661775 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="extract" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.661783 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="extract" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.661905 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee81c0e-d67b-4713-bdb1-62d8092358ec" containerName="extract" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.662402 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.664782 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7f6jk" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.680458 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj"] Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.733332 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknn4\" (UniqueName: \"kubernetes.io/projected/8096d5be-2884-4a45-839b-1b2b20bc116d-kube-api-access-kknn4\") pod \"openstack-operator-controller-init-5f8bcb546f-d72cj\" (UID: \"8096d5be-2884-4a45-839b-1b2b20bc116d\") " pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.834176 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kknn4\" (UniqueName: \"kubernetes.io/projected/8096d5be-2884-4a45-839b-1b2b20bc116d-kube-api-access-kknn4\") pod \"openstack-operator-controller-init-5f8bcb546f-d72cj\" (UID: \"8096d5be-2884-4a45-839b-1b2b20bc116d\") " pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.857512 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknn4\" (UniqueName: \"kubernetes.io/projected/8096d5be-2884-4a45-839b-1b2b20bc116d-kube-api-access-kknn4\") pod \"openstack-operator-controller-init-5f8bcb546f-d72cj\" (UID: \"8096d5be-2884-4a45-839b-1b2b20bc116d\") " pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:04 crc kubenswrapper[4694]: I0217 16:59:04.986284 4694 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:05 crc kubenswrapper[4694]: I0217 16:59:05.192491 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj"] Feb 17 16:59:05 crc kubenswrapper[4694]: I0217 16:59:05.912830 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" event={"ID":"8096d5be-2884-4a45-839b-1b2b20bc116d","Type":"ContainerStarted","Data":"d3164e5a1e1b286e6ee043ac0c77737ef1a99dee8be50a87f8614d7aeba54b00"} Feb 17 16:59:08 crc kubenswrapper[4694]: I0217 16:59:08.939899 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" event={"ID":"8096d5be-2884-4a45-839b-1b2b20bc116d","Type":"ContainerStarted","Data":"832e52e49ccbd0b04e990749b4dfc4c922783cc656a93334b2aaa1ef72cab466"} Feb 17 16:59:08 crc kubenswrapper[4694]: I0217 16:59:08.940548 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:08 crc kubenswrapper[4694]: I0217 16:59:08.969828 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" podStartSLOduration=1.441825328 podStartE2EDuration="4.969806996s" podCreationTimestamp="2026-02-17 16:59:04 +0000 UTC" firstStartedPulling="2026-02-17 16:59:05.203169667 +0000 UTC m=+1012.960244991" lastFinishedPulling="2026-02-17 16:59:08.731151335 +0000 UTC m=+1016.488226659" observedRunningTime="2026-02-17 16:59:08.966668049 +0000 UTC m=+1016.723743383" watchObservedRunningTime="2026-02-17 16:59:08.969806996 +0000 UTC m=+1016.726882340" Feb 17 16:59:15 crc kubenswrapper[4694]: I0217 16:59:15.004289 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-5f8bcb546f-d72cj" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.448009 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.449202 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.451007 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mvmpb" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.463059 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.463802 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.466024 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2v9mc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.468046 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.476753 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29cv\" (UniqueName: \"kubernetes.io/projected/77c2abe7-0f53-4b11-932c-1f767a6d21b2-kube-api-access-q29cv\") pod \"barbican-operator-controller-manager-c4b7d6946-xc6sk\" (UID: \"77c2abe7-0f53-4b11-932c-1f767a6d21b2\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.476808 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsrgf\" (UniqueName: \"kubernetes.io/projected/b43c51f9-625e-4499-bdcc-612213a353df-kube-api-access-rsrgf\") pod \"cinder-operator-controller-manager-57746b5ff9-fk2xd\" (UID: \"b43c51f9-625e-4499-bdcc-612213a353df\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.490098 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.490821 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.494335 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pth6g" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.501749 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.549833 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.586280 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q29cv\" (UniqueName: \"kubernetes.io/projected/77c2abe7-0f53-4b11-932c-1f767a6d21b2-kube-api-access-q29cv\") pod \"barbican-operator-controller-manager-c4b7d6946-xc6sk\" (UID: \"77c2abe7-0f53-4b11-932c-1f767a6d21b2\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.586527 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsrgf\" (UniqueName: \"kubernetes.io/projected/b43c51f9-625e-4499-bdcc-612213a353df-kube-api-access-rsrgf\") pod \"cinder-operator-controller-manager-57746b5ff9-fk2xd\" (UID: \"b43c51f9-625e-4499-bdcc-612213a353df\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.591409 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.592431 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.603080 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-k7wxf" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.607568 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.608553 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.615777 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.616192 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wq48t" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.616573 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.621278 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.627733 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9lsgf" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.633426 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29cv\" (UniqueName: \"kubernetes.io/projected/77c2abe7-0f53-4b11-932c-1f767a6d21b2-kube-api-access-q29cv\") pod \"barbican-operator-controller-manager-c4b7d6946-xc6sk\" (UID: \"77c2abe7-0f53-4b11-932c-1f767a6d21b2\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.634206 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsrgf\" (UniqueName: \"kubernetes.io/projected/b43c51f9-625e-4499-bdcc-612213a353df-kube-api-access-rsrgf\") pod \"cinder-operator-controller-manager-57746b5ff9-fk2xd\" (UID: \"b43c51f9-625e-4499-bdcc-612213a353df\") " pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.639128 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.643474 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.656225 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc"] Feb 
17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.656950 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.662189 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.665148 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-6mbwl" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.665341 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.666142 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.671566 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kf6ps" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.688024 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7p8r\" (UniqueName: \"kubernetes.io/projected/6df3483a-cb85-4da5-a314-e0aea8874af8-kube-api-access-b7p8r\") pod \"designate-operator-controller-manager-55cc45767f-dkblx\" (UID: \"6df3483a-cb85-4da5-a314-e0aea8874af8\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.712970 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.715542 4694 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.723266 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tlr9d" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.723816 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.731018 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.746819 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.747835 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.752951 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tmch6" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.771829 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.776690 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.788385 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.788945 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8vn\" (UniqueName: \"kubernetes.io/projected/17fc1c20-dd01-4529-99a2-5a758dd7d8f1-kube-api-access-pl8vn\") pod \"heat-operator-controller-manager-9595d6797-cc8kn\" (UID: \"17fc1c20-dd01-4529-99a2-5a758dd7d8f1\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.788995 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxr4c\" (UniqueName: \"kubernetes.io/projected/73478819-b2d9-484f-8f31-12636ce0fff1-kube-api-access-rxr4c\") pod \"ironic-operator-controller-manager-6494cdbf8f-5zb28\" (UID: \"73478819-b2d9-484f-8f31-12636ce0fff1\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.789019 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4wc\" (UniqueName: \"kubernetes.io/projected/37924017-8b0a-4920-becb-89e528139e25-kube-api-access-gs4wc\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.789063 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.789100 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fj9b\" (UniqueName: \"kubernetes.io/projected/4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd-kube-api-access-4fj9b\") pod \"glance-operator-controller-manager-68c6d499cb-pkrtr\" (UID: \"4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.789137 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7p8r\" (UniqueName: \"kubernetes.io/projected/6df3483a-cb85-4da5-a314-e0aea8874af8-kube-api-access-b7p8r\") pod \"designate-operator-controller-manager-55cc45767f-dkblx\" (UID: \"6df3483a-cb85-4da5-a314-e0aea8874af8\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.789160 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v75b7\" (UniqueName: \"kubernetes.io/projected/9690ac41-b444-4d39-adfb-de4dd1b4d581-kube-api-access-v75b7\") pod \"horizon-operator-controller-manager-54fb488b88-5s57w\" (UID: \"9690ac41-b444-4d39-adfb-de4dd1b4d581\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.789424 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.790302 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.812344 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4q24f" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.822212 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.850771 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7p8r\" (UniqueName: \"kubernetes.io/projected/6df3483a-cb85-4da5-a314-e0aea8874af8-kube-api-access-b7p8r\") pod \"designate-operator-controller-manager-55cc45767f-dkblx\" (UID: \"6df3483a-cb85-4da5-a314-e0aea8874af8\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.850859 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.860320 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.861401 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.866325 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-df4xf" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.891975 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fj9b\" (UniqueName: \"kubernetes.io/projected/4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd-kube-api-access-4fj9b\") pod \"glance-operator-controller-manager-68c6d499cb-pkrtr\" (UID: \"4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892063 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v75b7\" (UniqueName: \"kubernetes.io/projected/9690ac41-b444-4d39-adfb-de4dd1b4d581-kube-api-access-v75b7\") pod \"horizon-operator-controller-manager-54fb488b88-5s57w\" (UID: \"9690ac41-b444-4d39-adfb-de4dd1b4d581\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892097 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8vn\" (UniqueName: \"kubernetes.io/projected/17fc1c20-dd01-4529-99a2-5a758dd7d8f1-kube-api-access-pl8vn\") pod \"heat-operator-controller-manager-9595d6797-cc8kn\" (UID: \"17fc1c20-dd01-4529-99a2-5a758dd7d8f1\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892127 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5bh\" (UniqueName: \"kubernetes.io/projected/42a9eede-5ff2-40da-9491-44a8012320a2-kube-api-access-hm5bh\") pod 
\"keystone-operator-controller-manager-6c78d668d5-wj245\" (UID: \"42a9eede-5ff2-40da-9491-44a8012320a2\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892164 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w96mf\" (UniqueName: \"kubernetes.io/projected/80223b5f-ef20-4fc6-b4bc-b8d63046db41-kube-api-access-w96mf\") pod \"mariadb-operator-controller-manager-66997756f6-z7ldj\" (UID: \"80223b5f-ef20-4fc6-b4bc-b8d63046db41\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892201 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxr4c\" (UniqueName: \"kubernetes.io/projected/73478819-b2d9-484f-8f31-12636ce0fff1-kube-api-access-rxr4c\") pod \"ironic-operator-controller-manager-6494cdbf8f-5zb28\" (UID: \"73478819-b2d9-484f-8f31-12636ce0fff1\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892228 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4wc\" (UniqueName: \"kubernetes.io/projected/37924017-8b0a-4920-becb-89e528139e25-kube-api-access-gs4wc\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892289 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " 
pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.892338 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnqg\" (UniqueName: \"kubernetes.io/projected/df099335-bb95-4e69-9628-9f53a170b043-kube-api-access-4wnqg\") pod \"manila-operator-controller-manager-96fff9cb8-4qw8g\" (UID: \"df099335-bb95-4e69-9628-9f53a170b043\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 16:59:52 crc kubenswrapper[4694]: E0217 16:59:52.892413 4694 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:52 crc kubenswrapper[4694]: E0217 16:59:52.892477 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert podName:37924017-8b0a-4920-becb-89e528139e25 nodeName:}" failed. No retries permitted until 2026-02-17 16:59:53.392459189 +0000 UTC m=+1061.149534503 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert") pod "infra-operator-controller-manager-66d6b5f488-f8krc" (UID: "37924017-8b0a-4920-becb-89e528139e25") : secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.913509 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.919426 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v75b7\" (UniqueName: \"kubernetes.io/projected/9690ac41-b444-4d39-adfb-de4dd1b4d581-kube-api-access-v75b7\") pod \"horizon-operator-controller-manager-54fb488b88-5s57w\" (UID: \"9690ac41-b444-4d39-adfb-de4dd1b4d581\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.919582 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4wc\" (UniqueName: \"kubernetes.io/projected/37924017-8b0a-4920-becb-89e528139e25-kube-api-access-gs4wc\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.920411 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxr4c\" (UniqueName: \"kubernetes.io/projected/73478819-b2d9-484f-8f31-12636ce0fff1-kube-api-access-rxr4c\") pod \"ironic-operator-controller-manager-6494cdbf8f-5zb28\" (UID: \"73478819-b2d9-484f-8f31-12636ce0fff1\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.924338 4694 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.925246 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.928904 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-p5j8c" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.941072 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8vn\" (UniqueName: \"kubernetes.io/projected/17fc1c20-dd01-4529-99a2-5a758dd7d8f1-kube-api-access-pl8vn\") pod \"heat-operator-controller-manager-9595d6797-cc8kn\" (UID: \"17fc1c20-dd01-4529-99a2-5a758dd7d8f1\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.946037 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fj9b\" (UniqueName: \"kubernetes.io/projected/4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd-kube-api-access-4fj9b\") pod \"glance-operator-controller-manager-68c6d499cb-pkrtr\" (UID: \"4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd\") " pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.956734 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.978803 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk"] Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.979815 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.980271 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.982296 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ncfzh" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.995758 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnqg\" (UniqueName: \"kubernetes.io/projected/df099335-bb95-4e69-9628-9f53a170b043-kube-api-access-4wnqg\") pod \"manila-operator-controller-manager-96fff9cb8-4qw8g\" (UID: \"df099335-bb95-4e69-9628-9f53a170b043\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.995801 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58svc\" (UniqueName: \"kubernetes.io/projected/4ab6ce1a-7a26-4b71-be84-b27da7acf5c4-kube-api-access-58svc\") pod \"neutron-operator-controller-manager-54967dbbdf-kqwxc\" (UID: \"4ab6ce1a-7a26-4b71-be84-b27da7acf5c4\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.995845 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm5bh\" (UniqueName: \"kubernetes.io/projected/42a9eede-5ff2-40da-9491-44a8012320a2-kube-api-access-hm5bh\") pod \"keystone-operator-controller-manager-6c78d668d5-wj245\" (UID: \"42a9eede-5ff2-40da-9491-44a8012320a2\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.995870 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w96mf\" (UniqueName: \"kubernetes.io/projected/80223b5f-ef20-4fc6-b4bc-b8d63046db41-kube-api-access-w96mf\") pod \"mariadb-operator-controller-manager-66997756f6-z7ldj\" (UID: \"80223b5f-ef20-4fc6-b4bc-b8d63046db41\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 16:59:52 crc kubenswrapper[4694]: I0217 16:59:52.996382 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.002200 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.013327 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnqg\" (UniqueName: \"kubernetes.io/projected/df099335-bb95-4e69-9628-9f53a170b043-kube-api-access-4wnqg\") pod \"manila-operator-controller-manager-96fff9cb8-4qw8g\" (UID: \"df099335-bb95-4e69-9628-9f53a170b043\") " pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.014427 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w96mf\" (UniqueName: \"kubernetes.io/projected/80223b5f-ef20-4fc6-b4bc-b8d63046db41-kube-api-access-w96mf\") pod \"mariadb-operator-controller-manager-66997756f6-z7ldj\" (UID: \"80223b5f-ef20-4fc6-b4bc-b8d63046db41\") " pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.016085 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm5bh\" (UniqueName: \"kubernetes.io/projected/42a9eede-5ff2-40da-9491-44a8012320a2-kube-api-access-hm5bh\") pod 
\"keystone-operator-controller-manager-6c78d668d5-wj245\" (UID: \"42a9eede-5ff2-40da-9491-44a8012320a2\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.051963 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.052721 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.054146 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.060163 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ldvm4" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.063287 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.064356 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.071912 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.075515 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.076516 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nj5m8" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.086780 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.106613 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58svc\" (UniqueName: \"kubernetes.io/projected/4ab6ce1a-7a26-4b71-be84-b27da7acf5c4-kube-api-access-58svc\") pod \"neutron-operator-controller-manager-54967dbbdf-kqwxc\" (UID: \"4ab6ce1a-7a26-4b71-be84-b27da7acf5c4\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.106918 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78g8j\" (UniqueName: \"kubernetes.io/projected/26d16927-5afd-4da9-a66a-c20006f1d9e7-kube-api-access-78g8j\") pod \"octavia-operator-controller-manager-745bbbd77b-m2ntk\" (UID: \"26d16927-5afd-4da9-a66a-c20006f1d9e7\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.107022 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5pcz\" (UniqueName: \"kubernetes.io/projected/c97c1c8a-6504-4db2-ad45-0a0c2f84551f-kube-api-access-h5pcz\") pod 
\"nova-operator-controller-manager-5ddd85db87-8zxvf\" (UID: \"c97c1c8a-6504-4db2-ad45-0a0c2f84551f\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.121806 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.137847 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.153819 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58svc\" (UniqueName: \"kubernetes.io/projected/4ab6ce1a-7a26-4b71-be84-b27da7acf5c4-kube-api-access-58svc\") pod \"neutron-operator-controller-manager-54967dbbdf-kqwxc\" (UID: \"4ab6ce1a-7a26-4b71-be84-b27da7acf5c4\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.189191 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.190354 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.197079 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.198234 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.201883 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.203001 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qzxkx" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.203174 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.203669 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4tcm6" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.210171 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.215091 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fvk9\" (UniqueName: \"kubernetes.io/projected/db08b317-d51b-471e-a235-2c9b4cd1f6f7-kube-api-access-2fvk9\") pod \"ovn-operator-controller-manager-85c99d655-pj5f2\" (UID: \"db08b317-d51b-471e-a235-2c9b4cd1f6f7\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.215166 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78g8j\" (UniqueName: \"kubernetes.io/projected/26d16927-5afd-4da9-a66a-c20006f1d9e7-kube-api-access-78g8j\") pod \"octavia-operator-controller-manager-745bbbd77b-m2ntk\" (UID: \"26d16927-5afd-4da9-a66a-c20006f1d9e7\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.215204 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h5pcz\" (UniqueName: \"kubernetes.io/projected/c97c1c8a-6504-4db2-ad45-0a0c2f84551f-kube-api-access-h5pcz\") pod \"nova-operator-controller-manager-5ddd85db87-8zxvf\" (UID: \"c97c1c8a-6504-4db2-ad45-0a0c2f84551f\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.215232 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjz5q\" (UniqueName: \"kubernetes.io/projected/c9a788f4-d0b6-4275-9b3c-33f39fe70178-kube-api-access-bjz5q\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.215264 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.215404 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.224008 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.224720 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.254117 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.254991 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.256248 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5pcz\" (UniqueName: \"kubernetes.io/projected/c97c1c8a-6504-4db2-ad45-0a0c2f84551f-kube-api-access-h5pcz\") pod \"nova-operator-controller-manager-5ddd85db87-8zxvf\" (UID: \"c97c1c8a-6504-4db2-ad45-0a0c2f84551f\") " pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.256954 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.258092 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-sxqh7" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.274580 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78g8j\" (UniqueName: \"kubernetes.io/projected/26d16927-5afd-4da9-a66a-c20006f1d9e7-kube-api-access-78g8j\") pod \"octavia-operator-controller-manager-745bbbd77b-m2ntk\" (UID: \"26d16927-5afd-4da9-a66a-c20006f1d9e7\") " pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.277451 4694 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.277915 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.278569 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.283861 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lnngp" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.300776 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.304119 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.318081 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rft7\" (UniqueName: \"kubernetes.io/projected/7eee970c-7e73-4420-adc3-331ee21c914c-kube-api-access-7rft7\") pod \"swift-operator-controller-manager-79558bbfbf-xgh9m\" (UID: \"7eee970c-7e73-4420-adc3-331ee21c914c\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.318137 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6wq\" (UniqueName: \"kubernetes.io/projected/ad0948c6-64ee-47a8-b5da-aaa3b8f051ef-kube-api-access-ch6wq\") pod \"test-operator-controller-manager-8467ccb4c8-w98pl\" (UID: \"ad0948c6-64ee-47a8-b5da-aaa3b8f051ef\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.318164 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjz5q\" (UniqueName: \"kubernetes.io/projected/c9a788f4-d0b6-4275-9b3c-33f39fe70178-kube-api-access-bjz5q\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.318190 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 
crc kubenswrapper[4694]: I0217 16:59:53.318209 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czhgr\" (UniqueName: \"kubernetes.io/projected/3629dece-1e4c-40cf-bb56-d15d6ca8aa44-kube-api-access-czhgr\") pod \"placement-operator-controller-manager-57bd55f9b7-xgzng\" (UID: \"3629dece-1e4c-40cf-bb56-d15d6ca8aa44\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.318239 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dsw9\" (UniqueName: \"kubernetes.io/projected/97487951-8888-4a3b-91fc-76d324fdf255-kube-api-access-8dsw9\") pod \"telemetry-operator-controller-manager-56dc67d744-7285z\" (UID: \"97487951-8888-4a3b-91fc-76d324fdf255\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.318283 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fvk9\" (UniqueName: \"kubernetes.io/projected/db08b317-d51b-471e-a235-2c9b4cd1f6f7-kube-api-access-2fvk9\") pod \"ovn-operator-controller-manager-85c99d655-pj5f2\" (UID: \"db08b317-d51b-471e-a235-2c9b4cd1f6f7\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.319064 4694 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.319145 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert podName:c9a788f4-d0b6-4275-9b3c-33f39fe70178 nodeName:}" failed. 
No retries permitted until 2026-02-17 16:59:53.819121037 +0000 UTC m=+1061.576196411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" (UID: "c9a788f4-d0b6-4275-9b3c-33f39fe70178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.333021 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.334016 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.340784 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.347504 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rl8rv" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.361191 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjz5q\" (UniqueName: \"kubernetes.io/projected/c9a788f4-d0b6-4275-9b3c-33f39fe70178-kube-api-access-bjz5q\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.369993 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fvk9\" (UniqueName: \"kubernetes.io/projected/db08b317-d51b-471e-a235-2c9b4cd1f6f7-kube-api-access-2fvk9\") pod 
\"ovn-operator-controller-manager-85c99d655-pj5f2\" (UID: \"db08b317-d51b-471e-a235-2c9b4cd1f6f7\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.397693 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.419205 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.420299 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.423710 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.424085 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.424596 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6wq\" (UniqueName: \"kubernetes.io/projected/ad0948c6-64ee-47a8-b5da-aaa3b8f051ef-kube-api-access-ch6wq\") pod \"test-operator-controller-manager-8467ccb4c8-w98pl\" (UID: \"ad0948c6-64ee-47a8-b5da-aaa3b8f051ef\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.424697 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czhgr\" (UniqueName: \"kubernetes.io/projected/3629dece-1e4c-40cf-bb56-d15d6ca8aa44-kube-api-access-czhgr\") pod \"placement-operator-controller-manager-57bd55f9b7-xgzng\" (UID: \"3629dece-1e4c-40cf-bb56-d15d6ca8aa44\") " 
pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.424753 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dsw9\" (UniqueName: \"kubernetes.io/projected/97487951-8888-4a3b-91fc-76d324fdf255-kube-api-access-8dsw9\") pod \"telemetry-operator-controller-manager-56dc67d744-7285z\" (UID: \"97487951-8888-4a3b-91fc-76d324fdf255\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.424787 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.424853 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rft7\" (UniqueName: \"kubernetes.io/projected/7eee970c-7e73-4420-adc3-331ee21c914c-kube-api-access-7rft7\") pod \"swift-operator-controller-manager-79558bbfbf-xgh9m\" (UID: \"7eee970c-7e73-4420-adc3-331ee21c914c\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.425446 4694 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.425506 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert podName:37924017-8b0a-4920-becb-89e528139e25 nodeName:}" failed. 
No retries permitted until 2026-02-17 16:59:54.425487864 +0000 UTC m=+1062.182563188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert") pod "infra-operator-controller-manager-66d6b5f488-f8krc" (UID: "37924017-8b0a-4920-becb-89e528139e25") : secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.428730 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.449529 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-p2bqq" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.457304 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czhgr\" (UniqueName: \"kubernetes.io/projected/3629dece-1e4c-40cf-bb56-d15d6ca8aa44-kube-api-access-czhgr\") pod \"placement-operator-controller-manager-57bd55f9b7-xgzng\" (UID: \"3629dece-1e4c-40cf-bb56-d15d6ca8aa44\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.472459 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rft7\" (UniqueName: \"kubernetes.io/projected/7eee970c-7e73-4420-adc3-331ee21c914c-kube-api-access-7rft7\") pod \"swift-operator-controller-manager-79558bbfbf-xgh9m\" (UID: \"7eee970c-7e73-4420-adc3-331ee21c914c\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.474707 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6wq\" (UniqueName: \"kubernetes.io/projected/ad0948c6-64ee-47a8-b5da-aaa3b8f051ef-kube-api-access-ch6wq\") pod 
\"test-operator-controller-manager-8467ccb4c8-w98pl\" (UID: \"ad0948c6-64ee-47a8-b5da-aaa3b8f051ef\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.474981 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dsw9\" (UniqueName: \"kubernetes.io/projected/97487951-8888-4a3b-91fc-76d324fdf255-kube-api-access-8dsw9\") pod \"telemetry-operator-controller-manager-56dc67d744-7285z\" (UID: \"97487951-8888-4a3b-91fc-76d324fdf255\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.522016 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.522820 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.531744 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhfpk\" (UniqueName: \"kubernetes.io/projected/1e4d1433-71e3-4cc3-8873-1fa5cd78e961-kube-api-access-xhfpk\") pod \"watcher-operator-controller-manager-6c469bc6bb-ch6kt\" (UID: \"1e4d1433-71e3-4cc3-8873-1fa5cd78e961\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.531843 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 
16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.531964 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgd2n\" (UniqueName: \"kubernetes.io/projected/607606da-e993-49b1-98ff-a2e0c2146f8a-kube-api-access-tgd2n\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.531994 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.534373 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bs2c9" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.541947 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.552649 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.633666 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgd2n\" (UniqueName: \"kubernetes.io/projected/607606da-e993-49b1-98ff-a2e0c2146f8a-kube-api-access-tgd2n\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " 
pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.633915 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.633943 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhfpk\" (UniqueName: \"kubernetes.io/projected/1e4d1433-71e3-4cc3-8873-1fa5cd78e961-kube-api-access-xhfpk\") pod \"watcher-operator-controller-manager-6c469bc6bb-ch6kt\" (UID: \"1e4d1433-71e3-4cc3-8873-1fa5cd78e961\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.633986 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.634038 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm8fv\" (UniqueName: \"kubernetes.io/projected/a65ffd3d-87f5-4491-b71e-9823c314bc1f-kube-api-access-tm8fv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ggsvh\" (UID: \"a65ffd3d-87f5-4491-b71e-9823c314bc1f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.634363 4694 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.634421 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 16:59:54.134400254 +0000 UTC m=+1061.891475578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.634494 4694 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.634537 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 16:59:54.134521567 +0000 UTC m=+1061.891596891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "metrics-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.652476 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.661835 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhfpk\" (UniqueName: \"kubernetes.io/projected/1e4d1433-71e3-4cc3-8873-1fa5cd78e961-kube-api-access-xhfpk\") pod \"watcher-operator-controller-manager-6c469bc6bb-ch6kt\" (UID: \"1e4d1433-71e3-4cc3-8873-1fa5cd78e961\") " pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.662813 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgd2n\" (UniqueName: \"kubernetes.io/projected/607606da-e993-49b1-98ff-a2e0c2146f8a-kube-api-access-tgd2n\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.679581 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.690493 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk"] Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.698693 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.719442 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.735777 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm8fv\" (UniqueName: \"kubernetes.io/projected/a65ffd3d-87f5-4491-b71e-9823c314bc1f-kube-api-access-tm8fv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ggsvh\" (UID: \"a65ffd3d-87f5-4491-b71e-9823c314bc1f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.738802 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.759275 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm8fv\" (UniqueName: \"kubernetes.io/projected/a65ffd3d-87f5-4491-b71e-9823c314bc1f-kube-api-access-tm8fv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ggsvh\" (UID: \"a65ffd3d-87f5-4491-b71e-9823c314bc1f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.787172 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn"] Feb 17 16:59:53 crc kubenswrapper[4694]: W0217 16:59:53.807301 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c2abe7_0f53_4b11_932c_1f767a6d21b2.slice/crio-ad5eddd94a2472ab28219b3b19eb5b0b948231d99334d43e734b6c84d3168550 WatchSource:0}: Error finding container ad5eddd94a2472ab28219b3b19eb5b0b948231d99334d43e734b6c84d3168550: Status 404 returned error can't find the container with id ad5eddd94a2472ab28219b3b19eb5b0b948231d99334d43e734b6c84d3168550 Feb 17 16:59:53 crc kubenswrapper[4694]: 
I0217 16:59:53.838992 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.839299 4694 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: E0217 16:59:53.839409 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert podName:c9a788f4-d0b6-4275-9b3c-33f39fe70178 nodeName:}" failed. No retries permitted until 2026-02-17 16:59:54.839367597 +0000 UTC m=+1062.596442981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" (UID: "c9a788f4-d0b6-4275-9b3c-33f39fe70178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:53 crc kubenswrapper[4694]: I0217 16:59:53.877264 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.147167 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.147366 4694 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.147633 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 16:59:55.14759957 +0000 UTC m=+1062.904674894 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "webhook-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.147633 4694 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.147697 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 16:59:55.147690693 +0000 UTC m=+1062.904766017 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "metrics-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.147561 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.237038 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" event={"ID":"17fc1c20-dd01-4529-99a2-5a758dd7d8f1","Type":"ContainerStarted","Data":"2dae3279cd0a0b81732108b2b7b4d1d2644d2efaa08f1c97d5719112005cadd3"} Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.238488 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" event={"ID":"77c2abe7-0f53-4b11-932c-1f767a6d21b2","Type":"ContainerStarted","Data":"ad5eddd94a2472ab28219b3b19eb5b0b948231d99334d43e734b6c84d3168550"} Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.239715 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" event={"ID":"b43c51f9-625e-4499-bdcc-612213a353df","Type":"ContainerStarted","Data":"87db3fb80f05df8d9f18341a341240febff1847bbe615abc958a78e5fcab592a"} Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.252851 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245"] Feb 17 16:59:54 
crc kubenswrapper[4694]: I0217 16:59:54.262318 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.272200 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28"] Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.280528 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9690ac41_b444_4d39_adfb_de4dd1b4d581.slice/crio-2a0451a24fad420efde83f2d93340461dc2ebe0dd81412795f7401fe8f0c002c WatchSource:0}: Error finding container 2a0451a24fad420efde83f2d93340461dc2ebe0dd81412795f7401fe8f0c002c: Status 404 returned error can't find the container with id 2a0451a24fad420efde83f2d93340461dc2ebe0dd81412795f7401fe8f0c002c Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.359198 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr"] Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.374732 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d16927_5afd_4da9_a66a_c20006f1d9e7.slice/crio-1aa8c8905a812ad032ee3536481d7a14312198e9224f54777dc19945f27c862d WatchSource:0}: Error finding container 1aa8c8905a812ad032ee3536481d7a14312198e9224f54777dc19945f27c862d: Status 404 returned error can't find the container with id 1aa8c8905a812ad032ee3536481d7a14312198e9224f54777dc19945f27c862d Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.376708 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.389734 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj"] Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.394858 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf099335_bb95_4e69_9628_9f53a170b043.slice/crio-359c56d6831414426fe9e71d2ef2a141921f5adc183f4e6ad6e93ab91ce7dc7d WatchSource:0}: Error finding container 359c56d6831414426fe9e71d2ef2a141921f5adc183f4e6ad6e93ab91ce7dc7d: Status 404 returned error can't find the container with id 359c56d6831414426fe9e71d2ef2a141921f5adc183f4e6ad6e93ab91ce7dc7d Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.396960 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.401232 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.465667 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.465809 4694 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.465853 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert podName:37924017-8b0a-4920-becb-89e528139e25 nodeName:}" failed. 
No retries permitted until 2026-02-17 16:59:56.4658383 +0000 UTC m=+1064.222913614 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert") pod "infra-operator-controller-manager-66d6b5f488-f8krc" (UID: "37924017-8b0a-4920-becb-89e528139e25") : secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.555814 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.585573 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.593233 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.600002 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.605383 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m"] Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.612054 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df3483a_cb85_4da5_a314_e0aea8874af8.slice/crio-9dbdaeb69b5f57b37ece68e3e429cc3d10d60dfc9808c82aa395402734e60f00 WatchSource:0}: Error finding container 9dbdaeb69b5f57b37ece68e3e429cc3d10d60dfc9808c82aa395402734e60f00: Status 404 returned error can't find the container with id 9dbdaeb69b5f57b37ece68e3e429cc3d10d60dfc9808c82aa395402734e60f00 Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.614680 4694 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.619579 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.625598 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng"] Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.629835 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh"] Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.631201 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e4d1433_71e3_4cc3_8873_1fa5cd78e961.slice/crio-f085efabae37382d793f9e8bb48a24012637bc82688bb5e0228d0e5def36772d WatchSource:0}: Error finding container f085efabae37382d793f9e8bb48a24012637bc82688bb5e0228d0e5def36772d: Status 404 returned error can't find the container with id f085efabae37382d793f9e8bb48a24012637bc82688bb5e0228d0e5def36772d Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.634684 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc97c1c8a_6504_4db2_ad45_0a0c2f84551f.slice/crio-89a04072f953472d631f656d2efc3fe00c7fc97a9a925168b9996d8117e67570 WatchSource:0}: Error finding container 89a04072f953472d631f656d2efc3fe00c7fc97a9a925168b9996d8117e67570: Status 404 returned error can't find the container with id 89a04072f953472d631f656d2efc3fe00c7fc97a9a925168b9996d8117e67570 Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.654367 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97487951_8888_4a3b_91fc_76d324fdf255.slice/crio-43904a1f6949ac9b367397eab51270257f03e9477ae9696571bbe4c35f3a049a WatchSource:0}: Error finding container 43904a1f6949ac9b367397eab51270257f03e9477ae9696571bbe4c35f3a049a: Status 404 returned error can't find the container with id 43904a1f6949ac9b367397eab51270257f03e9477ae9696571bbe4c35f3a049a Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.654482 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:5007f87a2869468db06d6257c17e389b587a095a087466c69c0c92328e699546,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b7p8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-55cc45767f-dkblx_openstack-operators(6df3483a-cb85-4da5-a314-e0aea8874af8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.655157 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eee970c_7e73_4420_adc3_331ee21c914c.slice/crio-ceb7d39e660d721fbd2c87ad3cbf46f5a086aeef9a8403ecff400df8859780ba WatchSource:0}: Error finding container ceb7d39e660d721fbd2c87ad3cbf46f5a086aeef9a8403ecff400df8859780ba: Status 404 returned error can't find the container with id ceb7d39e660d721fbd2c87ad3cbf46f5a086aeef9a8403ecff400df8859780ba Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.655561 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" podUID="6df3483a-cb85-4da5-a314-e0aea8874af8" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.655855 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8dsw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-56dc67d744-7285z_openstack-operators(97487951-8888-4a3b-91fc-76d324fdf255): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.655936 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3629dece_1e4c_40cf_bb56_d15d6ca8aa44.slice/crio-60ef1357325c08e88a79b1905337d9155394c3256ce8dd056fdd39c204695a24 WatchSource:0}: Error finding container 60ef1357325c08e88a79b1905337d9155394c3256ce8dd056fdd39c204695a24: Status 404 returned error can't find the container with id 60ef1357325c08e88a79b1905337d9155394c3256ce8dd056fdd39c204695a24 Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.657173 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" podUID="97487951-8888-4a3b-91fc-76d324fdf255" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.657852 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7rft7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-79558bbfbf-xgh9m_openstack-operators(7eee970c-7e73-4420-adc3-331ee21c914c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 16:59:54 crc kubenswrapper[4694]: W0217 16:59:54.657963 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65ffd3d_87f5_4491_b71e_9823c314bc1f.slice/crio-a6ab51289ccdbe47f2e9e811ba1a133ecf153bc591a566f2acf38cc70fb15024 WatchSource:0}: Error finding container a6ab51289ccdbe47f2e9e811ba1a133ecf153bc591a566f2acf38cc70fb15024: Status 404 returned error can't find the container with id a6ab51289ccdbe47f2e9e811ba1a133ecf153bc591a566f2acf38cc70fb15024 Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.658131 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-czhgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57bd55f9b7-xgzng_openstack-operators(3629dece-1e4c-40cf-bb56-d15d6ca8aa44): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.658912 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" podUID="7eee970c-7e73-4420-adc3-331ee21c914c" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.659935 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5pcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5ddd85db87-8zxvf_openstack-operators(c97c1c8a-6504-4db2-ad45-0a0c2f84551f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.660461 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" podUID="3629dece-1e4c-40cf-bb56-d15d6ca8aa44" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.661646 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tm8fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ggsvh_openstack-operators(a65ffd3d-87f5-4491-b71e-9823c314bc1f): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.661700 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" podUID="c97c1c8a-6504-4db2-ad45-0a0c2f84551f" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.663120 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" podUID="a65ffd3d-87f5-4491-b71e-9823c314bc1f" Feb 17 16:59:54 crc kubenswrapper[4694]: I0217 16:59:54.875996 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.876160 4694 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:54 crc kubenswrapper[4694]: E0217 16:59:54.876206 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert podName:c9a788f4-d0b6-4275-9b3c-33f39fe70178 nodeName:}" failed. No retries permitted until 2026-02-17 16:59:56.876190727 +0000 UTC m=+1064.633266051 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" (UID: "c9a788f4-d0b6-4275-9b3c-33f39fe70178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.180538 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.181476 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.181810 4694 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.182126 4694 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.182510 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 16:59:57.18197416 +0000 UTC m=+1064.939049484 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "webhook-server-cert" not found Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.182542 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 16:59:57.182530464 +0000 UTC m=+1064.939605788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "metrics-server-cert" not found Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.249761 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" event={"ID":"df099335-bb95-4e69-9628-9f53a170b043","Type":"ContainerStarted","Data":"359c56d6831414426fe9e71d2ef2a141921f5adc183f4e6ad6e93ab91ce7dc7d"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.252269 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" event={"ID":"7eee970c-7e73-4420-adc3-331ee21c914c","Type":"ContainerStarted","Data":"ceb7d39e660d721fbd2c87ad3cbf46f5a086aeef9a8403ecff400df8859780ba"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.254577 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" 
event={"ID":"26d16927-5afd-4da9-a66a-c20006f1d9e7","Type":"ContainerStarted","Data":"1aa8c8905a812ad032ee3536481d7a14312198e9224f54777dc19945f27c862d"} Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.254778 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" podUID="7eee970c-7e73-4420-adc3-331ee21c914c" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.267934 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" event={"ID":"3629dece-1e4c-40cf-bb56-d15d6ca8aa44","Type":"ContainerStarted","Data":"60ef1357325c08e88a79b1905337d9155394c3256ce8dd056fdd39c204695a24"} Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.270139 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" podUID="3629dece-1e4c-40cf-bb56-d15d6ca8aa44" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.270913 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" event={"ID":"9690ac41-b444-4d39-adfb-de4dd1b4d581","Type":"ContainerStarted","Data":"2a0451a24fad420efde83f2d93340461dc2ebe0dd81412795f7401fe8f0c002c"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.272032 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" 
event={"ID":"db08b317-d51b-471e-a235-2c9b4cd1f6f7","Type":"ContainerStarted","Data":"bb7db704d44345d14e4adbf24637817b3e4dbaf0475c5076e5bf691eb2bc1b58"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.274215 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" event={"ID":"1e4d1433-71e3-4cc3-8873-1fa5cd78e961","Type":"ContainerStarted","Data":"f085efabae37382d793f9e8bb48a24012637bc82688bb5e0228d0e5def36772d"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.276459 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" event={"ID":"a65ffd3d-87f5-4491-b71e-9823c314bc1f","Type":"ContainerStarted","Data":"a6ab51289ccdbe47f2e9e811ba1a133ecf153bc591a566f2acf38cc70fb15024"} Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.286938 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" podUID="a65ffd3d-87f5-4491-b71e-9823c314bc1f" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.289857 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" event={"ID":"4ab6ce1a-7a26-4b71-be84-b27da7acf5c4","Type":"ContainerStarted","Data":"8ff13cf16a9fc277b24c48acbd5c7376c33dcc8ccfd41625f563ee4cd04845d0"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.291577 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" 
event={"ID":"97487951-8888-4a3b-91fc-76d324fdf255","Type":"ContainerStarted","Data":"43904a1f6949ac9b367397eab51270257f03e9477ae9696571bbe4c35f3a049a"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.294875 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" event={"ID":"6df3483a-cb85-4da5-a314-e0aea8874af8","Type":"ContainerStarted","Data":"9dbdaeb69b5f57b37ece68e3e429cc3d10d60dfc9808c82aa395402734e60f00"} Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.296229 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" podUID="97487951-8888-4a3b-91fc-76d324fdf255" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.297003 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" event={"ID":"73478819-b2d9-484f-8f31-12636ce0fff1","Type":"ContainerStarted","Data":"079d4438fcff748708210c60c2fd9b79e53037348e0b0020a7cd67b2045699ae"} Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.297209 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:5007f87a2869468db06d6257c17e389b587a095a087466c69c0c92328e699546\\\"\"" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" podUID="6df3483a-cb85-4da5-a314-e0aea8874af8" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.298632 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" event={"ID":"4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd","Type":"ContainerStarted","Data":"41afd264992151a14336d50db97652da355de8964d335d76702195240ac754ed"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.304088 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" event={"ID":"c97c1c8a-6504-4db2-ad45-0a0c2f84551f","Type":"ContainerStarted","Data":"89a04072f953472d631f656d2efc3fe00c7fc97a9a925168b9996d8117e67570"} Feb 17 16:59:55 crc kubenswrapper[4694]: E0217 16:59:55.305342 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" podUID="c97c1c8a-6504-4db2-ad45-0a0c2f84551f" Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.306559 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" event={"ID":"ad0948c6-64ee-47a8-b5da-aaa3b8f051ef","Type":"ContainerStarted","Data":"88b45e53cf7b50f60d0168d507000d3ba6f73705d257aff9f2a07985752cb349"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.313558 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" event={"ID":"80223b5f-ef20-4fc6-b4bc-b8d63046db41","Type":"ContainerStarted","Data":"a0206525818ff613b54b79d3714966ff2a864c528067d37e08d4103a02ac5167"} Feb 17 16:59:55 crc kubenswrapper[4694]: I0217 16:59:55.321153 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" 
event={"ID":"42a9eede-5ff2-40da-9491-44a8012320a2","Type":"ContainerStarted","Data":"51277b41f9bbc0a0f643fcbc91ae73959de3bb60847119b4cccb12953644ef08"} Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.339880 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" podUID="3629dece-1e4c-40cf-bb56-d15d6ca8aa44" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.339912 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" podUID="97487951-8888-4a3b-91fc-76d324fdf255" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.339942 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" podUID="7eee970c-7e73-4420-adc3-331ee21c914c" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.340009 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:5007f87a2869468db06d6257c17e389b587a095a087466c69c0c92328e699546\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" podUID="6df3483a-cb85-4da5-a314-e0aea8874af8" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.340025 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" podUID="c97c1c8a-6504-4db2-ad45-0a0c2f84551f" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.340077 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" podUID="a65ffd3d-87f5-4491-b71e-9823c314bc1f" Feb 17 16:59:56 crc kubenswrapper[4694]: I0217 16:59:56.504266 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.506085 4694 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.506219 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert podName:37924017-8b0a-4920-becb-89e528139e25 nodeName:}" failed. 
No retries permitted until 2026-02-17 17:00:00.50619733 +0000 UTC m=+1068.263272654 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert") pod "infra-operator-controller-manager-66d6b5f488-f8krc" (UID: "37924017-8b0a-4920-becb-89e528139e25") : secret "infra-operator-webhook-server-cert" not found Feb 17 16:59:56 crc kubenswrapper[4694]: I0217 16:59:56.910124 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.910274 4694 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:56 crc kubenswrapper[4694]: E0217 16:59:56.910569 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert podName:c9a788f4-d0b6-4275-9b3c-33f39fe70178 nodeName:}" failed. No retries permitted until 2026-02-17 17:00:00.910548199 +0000 UTC m=+1068.667623523 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" (UID: "c9a788f4-d0b6-4275-9b3c-33f39fe70178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 16:59:57 crc kubenswrapper[4694]: I0217 16:59:57.215830 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:57 crc kubenswrapper[4694]: I0217 16:59:57.215913 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 16:59:57 crc kubenswrapper[4694]: E0217 16:59:57.216066 4694 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 16:59:57 crc kubenswrapper[4694]: E0217 16:59:57.216114 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 17:00:01.216100156 +0000 UTC m=+1068.973175480 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "metrics-server-cert" not found Feb 17 16:59:57 crc kubenswrapper[4694]: E0217 16:59:57.216542 4694 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 16:59:57 crc kubenswrapper[4694]: E0217 16:59:57.216579 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 17:00:01.216568358 +0000 UTC m=+1068.973643682 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "webhook-server-cert" not found Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.154596 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829"] Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.156126 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.158376 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.158790 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.163664 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829"] Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.264279 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bf31ae2-9449-4362-8a00-9e1fba466f0b-secret-volume\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.264429 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4kmf\" (UniqueName: \"kubernetes.io/projected/1bf31ae2-9449-4362-8a00-9e1fba466f0b-kube-api-access-w4kmf\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.264538 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bf31ae2-9449-4362-8a00-9e1fba466f0b-config-volume\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.368911 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bf31ae2-9449-4362-8a00-9e1fba466f0b-secret-volume\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.368967 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4kmf\" (UniqueName: \"kubernetes.io/projected/1bf31ae2-9449-4362-8a00-9e1fba466f0b-kube-api-access-w4kmf\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.368987 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bf31ae2-9449-4362-8a00-9e1fba466f0b-config-volume\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.369928 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bf31ae2-9449-4362-8a00-9e1fba466f0b-config-volume\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.375684 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1bf31ae2-9449-4362-8a00-9e1fba466f0b-secret-volume\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.386649 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4kmf\" (UniqueName: \"kubernetes.io/projected/1bf31ae2-9449-4362-8a00-9e1fba466f0b-kube-api-access-w4kmf\") pod \"collect-profiles-29522460-qx829\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.515414 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.571007 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 17:00:00 crc kubenswrapper[4694]: E0217 17:00:00.571188 4694 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 17:00:00 crc kubenswrapper[4694]: E0217 17:00:00.571282 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert podName:37924017-8b0a-4920-becb-89e528139e25 nodeName:}" failed. No retries permitted until 2026-02-17 17:00:08.571263005 +0000 UTC m=+1076.328338319 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert") pod "infra-operator-controller-manager-66d6b5f488-f8krc" (UID: "37924017-8b0a-4920-becb-89e528139e25") : secret "infra-operator-webhook-server-cert" not found Feb 17 17:00:00 crc kubenswrapper[4694]: I0217 17:00:00.978577 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 17:00:00 crc kubenswrapper[4694]: E0217 17:00:00.978840 4694 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 17:00:00 crc kubenswrapper[4694]: E0217 17:00:00.978994 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert podName:c9a788f4-d0b6-4275-9b3c-33f39fe70178 nodeName:}" failed. No retries permitted until 2026-02-17 17:00:08.978964276 +0000 UTC m=+1076.736039600 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" (UID: "c9a788f4-d0b6-4275-9b3c-33f39fe70178") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 17:00:01 crc kubenswrapper[4694]: I0217 17:00:01.283587 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:01 crc kubenswrapper[4694]: I0217 17:00:01.283738 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:01 crc kubenswrapper[4694]: E0217 17:00:01.283745 4694 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 17:00:01 crc kubenswrapper[4694]: E0217 17:00:01.283804 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 17:00:09.283784766 +0000 UTC m=+1077.040860090 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "webhook-server-cert" not found Feb 17 17:00:01 crc kubenswrapper[4694]: E0217 17:00:01.283834 4694 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 17:00:01 crc kubenswrapper[4694]: E0217 17:00:01.283869 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs podName:607606da-e993-49b1-98ff-a2e0c2146f8a nodeName:}" failed. No retries permitted until 2026-02-17 17:00:09.283858458 +0000 UTC m=+1077.040933792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs") pod "openstack-operator-controller-manager-7dbd849dbc-2qx8n" (UID: "607606da-e993-49b1-98ff-a2e0c2146f8a") : secret "metrics-server-cert" not found Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.228374 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829"] Feb 17 17:00:07 crc kubenswrapper[4694]: W0217 17:00:07.275384 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf31ae2_9449_4362_8a00_9e1fba466f0b.slice/crio-a98ebe70f19c7e5a0a2f7a729a57bbe3e4f296fd1503f9cca11ca469027e2952 WatchSource:0}: Error finding container a98ebe70f19c7e5a0a2f7a729a57bbe3e4f296fd1503f9cca11ca469027e2952: Status 404 returned error can't find the container with id a98ebe70f19c7e5a0a2f7a729a57bbe3e4f296fd1503f9cca11ca469027e2952 Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.435667 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" event={"ID":"4ab6ce1a-7a26-4b71-be84-b27da7acf5c4","Type":"ContainerStarted","Data":"ba4177a11e30bc183024c009f811c5493de4514907f7960ddbfb5977871db0f1"} Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.436233 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.446458 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" event={"ID":"17fc1c20-dd01-4529-99a2-5a758dd7d8f1","Type":"ContainerStarted","Data":"3d761ff8a93af410f734b7c53af08deb520fac902a8d505178edc2ee98dbe60d"} Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.446644 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.453644 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" event={"ID":"1bf31ae2-9449-4362-8a00-9e1fba466f0b","Type":"ContainerStarted","Data":"a98ebe70f19c7e5a0a2f7a729a57bbe3e4f296fd1503f9cca11ca469027e2952"} Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.458920 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" event={"ID":"db08b317-d51b-471e-a235-2c9b4cd1f6f7","Type":"ContainerStarted","Data":"a1ffedba4fc082ca4e96d23c89d430522ab192e839780111966a01f9fec88aab"} Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.459477 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.471630 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" event={"ID":"1e4d1433-71e3-4cc3-8873-1fa5cd78e961","Type":"ContainerStarted","Data":"1d811813d287655337e205136afa870087cc1775ac35e9792a51dc7551c5973a"} Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.472368 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.480291 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" event={"ID":"77c2abe7-0f53-4b11-932c-1f767a6d21b2","Type":"ContainerStarted","Data":"ecd63825d049cad0725cd09341d0e67ed8dbe26e0e7988ae960e6d8262fbb9eb"} Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.480975 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.498663 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" podStartSLOduration=3.342008107 podStartE2EDuration="15.498648324s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.597408508 +0000 UTC m=+1062.354483832" lastFinishedPulling="2026-02-17 17:00:06.754048725 +0000 UTC m=+1074.511124049" observedRunningTime="2026-02-17 17:00:07.469314892 +0000 UTC m=+1075.226390216" watchObservedRunningTime="2026-02-17 17:00:07.498648324 +0000 UTC m=+1075.255723648" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.498788 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" podStartSLOduration=3.132174014 podStartE2EDuration="15.498772557s" podCreationTimestamp="2026-02-17 16:59:52 +0000 
UTC" firstStartedPulling="2026-02-17 16:59:54.38412177 +0000 UTC m=+1062.141197094" lastFinishedPulling="2026-02-17 17:00:06.750720313 +0000 UTC m=+1074.507795637" observedRunningTime="2026-02-17 17:00:07.494937902 +0000 UTC m=+1075.252013226" watchObservedRunningTime="2026-02-17 17:00:07.498772557 +0000 UTC m=+1075.255847881" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.525490 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" podStartSLOduration=2.6600073269999998 podStartE2EDuration="15.525463414s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:53.826374817 +0000 UTC m=+1061.583450141" lastFinishedPulling="2026-02-17 17:00:06.691830864 +0000 UTC m=+1074.448906228" observedRunningTime="2026-02-17 17:00:07.52125272 +0000 UTC m=+1075.278328044" watchObservedRunningTime="2026-02-17 17:00:07.525463414 +0000 UTC m=+1075.282538738" Feb 17 17:00:07 crc kubenswrapper[4694]: I0217 17:00:07.577579 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" podStartSLOduration=2.4594542759999998 podStartE2EDuration="14.577557535s" podCreationTimestamp="2026-02-17 16:59:53 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.634589612 +0000 UTC m=+1062.391664936" lastFinishedPulling="2026-02-17 17:00:06.752692881 +0000 UTC m=+1074.509768195" observedRunningTime="2026-02-17 17:00:07.541829916 +0000 UTC m=+1075.298905240" watchObservedRunningTime="2026-02-17 17:00:07.577557535 +0000 UTC m=+1075.334632859" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.487948 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" 
event={"ID":"df099335-bb95-4e69-9628-9f53a170b043","Type":"ContainerStarted","Data":"a4268bf457c40d6c226dd95e071cac822d9e74ddf3886a20cbb5eadd3e24aab3"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.488971 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.489475 4694 generic.go:334] "Generic (PLEG): container finished" podID="1bf31ae2-9449-4362-8a00-9e1fba466f0b" containerID="2bac79601e7fedd63b3656671c82305768054b61a4128e337dbe3b48ff50900b" exitCode=0 Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.489511 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" event={"ID":"1bf31ae2-9449-4362-8a00-9e1fba466f0b","Type":"ContainerDied","Data":"2bac79601e7fedd63b3656671c82305768054b61a4128e337dbe3b48ff50900b"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.490845 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" event={"ID":"42a9eede-5ff2-40da-9491-44a8012320a2","Type":"ContainerStarted","Data":"22266124b1e918855f1b445078554c74dd6834120480936e93e99f50b1f3094f"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.491062 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.491938 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" event={"ID":"ad0948c6-64ee-47a8-b5da-aaa3b8f051ef","Type":"ContainerStarted","Data":"a8e293d8d32bdcc258f3d5662c8b0c101f4a2e5230cb0d1f3af04463d4ce7579"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.492009 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.493306 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" event={"ID":"80223b5f-ef20-4fc6-b4bc-b8d63046db41","Type":"ContainerStarted","Data":"97afc318eb86b98d6c1a7a7b6d217fb2185d72d7229424e05414599ee43eb886"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.493441 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.494581 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" event={"ID":"9690ac41-b444-4d39-adfb-de4dd1b4d581","Type":"ContainerStarted","Data":"90c366a2b7cd6b70abe2f0fda965cd1e56d86cd341ae6549aaacc7e5c17d1152"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.494733 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.495956 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" event={"ID":"4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd","Type":"ContainerStarted","Data":"26d8deae581d7ce611f3d646a44f8e1db69b772057d42db60908171ce987d557"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.496205 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.497226 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" 
event={"ID":"26d16927-5afd-4da9-a66a-c20006f1d9e7","Type":"ContainerStarted","Data":"10ef207fe563a9040c56177ca4b84f305d51790f1221f9172b798d16e2014e10"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.497360 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.498776 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" event={"ID":"b43c51f9-625e-4499-bdcc-612213a353df","Type":"ContainerStarted","Data":"70b08e104b34397005a66bf74c18a29d18cddbf3e36cffba1b0c701dfee26424"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.498881 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.500028 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" event={"ID":"73478819-b2d9-484f-8f31-12636ce0fff1","Type":"ContainerStarted","Data":"1435cd8bc92a54670b17518e7e82ed1f695d1c3081fed12d65cf8c293e9b5d6b"} Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.510171 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" podStartSLOduration=3.585120029 podStartE2EDuration="16.510157191s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:53.826312356 +0000 UTC m=+1061.583387680" lastFinishedPulling="2026-02-17 17:00:06.751349498 +0000 UTC m=+1074.508424842" observedRunningTime="2026-02-17 17:00:07.582853575 +0000 UTC m=+1075.339928899" watchObservedRunningTime="2026-02-17 17:00:08.510157191 +0000 UTC m=+1076.267232515" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.514581 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" podStartSLOduration=4.162129375 podStartE2EDuration="16.514572939s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.3995604 +0000 UTC m=+1062.156635724" lastFinishedPulling="2026-02-17 17:00:06.752003954 +0000 UTC m=+1074.509079288" observedRunningTime="2026-02-17 17:00:08.508852019 +0000 UTC m=+1076.265927353" watchObservedRunningTime="2026-02-17 17:00:08.514572939 +0000 UTC m=+1076.271648263" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.548905 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" podStartSLOduration=3.5627201680000002 podStartE2EDuration="16.548877563s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.265395559 +0000 UTC m=+1062.022470893" lastFinishedPulling="2026-02-17 17:00:07.251552964 +0000 UTC m=+1075.008628288" observedRunningTime="2026-02-17 17:00:08.534575062 +0000 UTC m=+1076.291650376" watchObservedRunningTime="2026-02-17 17:00:08.548877563 +0000 UTC m=+1076.305952887" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.683150 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: \"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.704366 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37924017-8b0a-4920-becb-89e528139e25-cert\") pod \"infra-operator-controller-manager-66d6b5f488-f8krc\" (UID: 
\"37924017-8b0a-4920-becb-89e528139e25\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.715507 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" podStartSLOduration=4.342505283 podStartE2EDuration="16.715485173s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.381153917 +0000 UTC m=+1062.138229241" lastFinishedPulling="2026-02-17 17:00:06.754133777 +0000 UTC m=+1074.511209131" observedRunningTime="2026-02-17 17:00:08.714847477 +0000 UTC m=+1076.471922801" watchObservedRunningTime="2026-02-17 17:00:08.715485173 +0000 UTC m=+1076.472560507" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.719801 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" podStartSLOduration=4.248589741 podStartE2EDuration="16.719779438s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.2829416 +0000 UTC m=+1062.040016924" lastFinishedPulling="2026-02-17 17:00:06.754131297 +0000 UTC m=+1074.511206621" observedRunningTime="2026-02-17 17:00:08.57148468 +0000 UTC m=+1076.328560004" watchObservedRunningTime="2026-02-17 17:00:08.719779438 +0000 UTC m=+1076.476854762" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.774754 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" podStartSLOduration=4.386758931 podStartE2EDuration="16.77473449s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.390185899 +0000 UTC m=+1062.147261223" lastFinishedPulling="2026-02-17 17:00:06.778161448 +0000 UTC m=+1074.535236782" observedRunningTime="2026-02-17 17:00:08.77349453 +0000 UTC 
m=+1076.530569874" watchObservedRunningTime="2026-02-17 17:00:08.77473449 +0000 UTC m=+1076.531809814" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.810626 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" podStartSLOduration=4.691574192 podStartE2EDuration="16.810597203s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.633181298 +0000 UTC m=+1062.390256622" lastFinishedPulling="2026-02-17 17:00:06.752204309 +0000 UTC m=+1074.509279633" observedRunningTime="2026-02-17 17:00:08.807391854 +0000 UTC m=+1076.564467178" watchObservedRunningTime="2026-02-17 17:00:08.810597203 +0000 UTC m=+1076.567672527" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.835964 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" podStartSLOduration=3.631887789 podStartE2EDuration="16.835940266s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:53.48181972 +0000 UTC m=+1061.238895044" lastFinishedPulling="2026-02-17 17:00:06.685872167 +0000 UTC m=+1074.442947521" observedRunningTime="2026-02-17 17:00:08.830371229 +0000 UTC m=+1076.587446573" watchObservedRunningTime="2026-02-17 17:00:08.835940266 +0000 UTC m=+1076.593015600" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.874905 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" podStartSLOduration=4.403003502 podStartE2EDuration="16.874890885s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.28007238 +0000 UTC m=+1062.037147704" lastFinishedPulling="2026-02-17 17:00:06.751959763 +0000 UTC m=+1074.509035087" observedRunningTime="2026-02-17 17:00:08.872031514 +0000 UTC m=+1076.629106838" 
watchObservedRunningTime="2026-02-17 17:00:08.874890885 +0000 UTC m=+1076.631966209" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.920508 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" podStartSLOduration=4.53952206 podStartE2EDuration="16.920473646s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.37153703 +0000 UTC m=+1062.128612354" lastFinishedPulling="2026-02-17 17:00:06.752488596 +0000 UTC m=+1074.509563940" observedRunningTime="2026-02-17 17:00:08.918761844 +0000 UTC m=+1076.675837178" watchObservedRunningTime="2026-02-17 17:00:08.920473646 +0000 UTC m=+1076.677548970" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.924555 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.988169 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 17:00:08 crc kubenswrapper[4694]: I0217 17:00:08.994531 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9a788f4-d0b6-4275-9b3c-33f39fe70178-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz\" (UID: \"c9a788f4-d0b6-4275-9b3c-33f39fe70178\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.025891 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.294334 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.294744 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.299413 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-metrics-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.299471 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/607606da-e993-49b1-98ff-a2e0c2146f8a-webhook-certs\") pod \"openstack-operator-controller-manager-7dbd849dbc-2qx8n\" (UID: \"607606da-e993-49b1-98ff-a2e0c2146f8a\") " pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.366743 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.512684 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.543738 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc"] Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.685890 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz"] Feb 17 17:00:09 crc kubenswrapper[4694]: W0217 17:00:09.691734 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a788f4_d0b6_4275_9b3c_33f39fe70178.slice/crio-18db5e79f627b4c47cba3e51393154d34e2a9c36bd8eabc051c8d22c7e94cbc3 WatchSource:0}: Error finding container 18db5e79f627b4c47cba3e51393154d34e2a9c36bd8eabc051c8d22c7e94cbc3: Status 404 returned error can't find the container with id 18db5e79f627b4c47cba3e51393154d34e2a9c36bd8eabc051c8d22c7e94cbc3 Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.696903 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n"] Feb 17 17:00:09 crc kubenswrapper[4694]: W0217 17:00:09.750805 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607606da_e993_49b1_98ff_a2e0c2146f8a.slice/crio-634d2a3171a41dd3287438e629300aa3166c125f027a66ee6d313f403eb41ff9 WatchSource:0}: Error finding container 634d2a3171a41dd3287438e629300aa3166c125f027a66ee6d313f403eb41ff9: Status 404 returned error can't find the container with id 634d2a3171a41dd3287438e629300aa3166c125f027a66ee6d313f403eb41ff9 
Feb 17 17:00:09 crc kubenswrapper[4694]: I0217 17:00:09.829870 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.021895 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bf31ae2-9449-4362-8a00-9e1fba466f0b-secret-volume\") pod \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.021953 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4kmf\" (UniqueName: \"kubernetes.io/projected/1bf31ae2-9449-4362-8a00-9e1fba466f0b-kube-api-access-w4kmf\") pod \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.022066 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bf31ae2-9449-4362-8a00-9e1fba466f0b-config-volume\") pod \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\" (UID: \"1bf31ae2-9449-4362-8a00-9e1fba466f0b\") " Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.022710 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf31ae2-9449-4362-8a00-9e1fba466f0b-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bf31ae2-9449-4362-8a00-9e1fba466f0b" (UID: "1bf31ae2-9449-4362-8a00-9e1fba466f0b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.028787 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf31ae2-9449-4362-8a00-9e1fba466f0b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bf31ae2-9449-4362-8a00-9e1fba466f0b" (UID: "1bf31ae2-9449-4362-8a00-9e1fba466f0b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.033916 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf31ae2-9449-4362-8a00-9e1fba466f0b-kube-api-access-w4kmf" (OuterVolumeSpecName: "kube-api-access-w4kmf") pod "1bf31ae2-9449-4362-8a00-9e1fba466f0b" (UID: "1bf31ae2-9449-4362-8a00-9e1fba466f0b"). InnerVolumeSpecName "kube-api-access-w4kmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.123827 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bf31ae2-9449-4362-8a00-9e1fba466f0b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.123864 4694 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bf31ae2-9449-4362-8a00-9e1fba466f0b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.123873 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4kmf\" (UniqueName: \"kubernetes.io/projected/1bf31ae2-9449-4362-8a00-9e1fba466f0b-kube-api-access-w4kmf\") on node \"crc\" DevicePath \"\"" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.523729 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" 
event={"ID":"37924017-8b0a-4920-becb-89e528139e25","Type":"ContainerStarted","Data":"eb015925d3637614a69a7ba1900fa3dabba0eb219a8d65d36b207256d7f9e64d"} Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.527160 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" event={"ID":"607606da-e993-49b1-98ff-a2e0c2146f8a","Type":"ContainerStarted","Data":"2b5475a8edd0828574dc846209eb071d020192bfb05b0153b9b4f9ba23c2a100"} Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.527198 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" event={"ID":"607606da-e993-49b1-98ff-a2e0c2146f8a","Type":"ContainerStarted","Data":"634d2a3171a41dd3287438e629300aa3166c125f027a66ee6d313f403eb41ff9"} Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.527308 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.532751 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" event={"ID":"c9a788f4-d0b6-4275-9b3c-33f39fe70178","Type":"ContainerStarted","Data":"18db5e79f627b4c47cba3e51393154d34e2a9c36bd8eabc051c8d22c7e94cbc3"} Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.535413 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.542842 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829" event={"ID":"1bf31ae2-9449-4362-8a00-9e1fba466f0b","Type":"ContainerDied","Data":"a98ebe70f19c7e5a0a2f7a729a57bbe3e4f296fd1503f9cca11ca469027e2952"} Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.542871 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98ebe70f19c7e5a0a2f7a729a57bbe3e4f296fd1503f9cca11ca469027e2952" Feb 17 17:00:10 crc kubenswrapper[4694]: I0217 17:00:10.565950 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" podStartSLOduration=17.565925791 podStartE2EDuration="17.565925791s" podCreationTimestamp="2026-02-17 16:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:00:10.560791924 +0000 UTC m=+1078.317867278" watchObservedRunningTime="2026-02-17 17:00:10.565925791 +0000 UTC m=+1078.323001115" Feb 17 17:00:12 crc kubenswrapper[4694]: I0217 17:00:12.777214 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-xc6sk" Feb 17 17:00:12 crc kubenswrapper[4694]: I0217 17:00:12.794667 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-57746b5ff9-fk2xd" Feb 17 17:00:12 crc kubenswrapper[4694]: I0217 17:00:12.983841 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-9595d6797-cc8kn" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.007241 4694 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-5s57w" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.057643 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-5zb28" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.080047 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-wj245" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.091098 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-96fff9cb8-4qw8g" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.206329 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66997756f6-z7ldj" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.221147 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-kqwxc" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.231394 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68c6d499cb-pkrtr" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.307355 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-745bbbd77b-m2ntk" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.400843 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-pj5f2" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.722583 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-w98pl" Feb 17 17:00:13 crc kubenswrapper[4694]: I0217 17:00:13.742970 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c469bc6bb-ch6kt" Feb 17 17:00:19 crc kubenswrapper[4694]: I0217 17:00:19.372864 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dbd849dbc-2qx8n" Feb 17 17:00:24 crc kubenswrapper[4694]: E0217 17:00:24.922145 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b" Feb 17 17:00:24 crc kubenswrapper[4694]: E0217 17:00:24.923175 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:REL
ATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-
podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope
-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,Val
ueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED
_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:18.0-fr5-latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:curre
nt-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-
api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjz5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz_openstack-operators(c9a788f4-d0b6-4275-9b3c-33f39fe70178): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:00:24 crc kubenswrapper[4694]: E0217 17:00:24.925361 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" podUID="c9a788f4-d0b6-4275-9b3c-33f39fe70178" Feb 17 17:00:25 crc kubenswrapper[4694]: E0217 17:00:25.782507 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:d8f38654cb385d3ff582419746c3d68d64c43cea412622f0e5dfcb32ee5ab47b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" podUID="c9a788f4-d0b6-4275-9b3c-33f39fe70178" Feb 17 17:00:26 crc kubenswrapper[4694]: E0217 17:00:26.303852 
4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 17 17:00:26 crc kubenswrapper[4694]: E0217 17:00:26.304123 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tm8fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ggsvh_openstack-operators(a65ffd3d-87f5-4491-b71e-9823c314bc1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:00:26 crc kubenswrapper[4694]: E0217 17:00:26.305505 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" podUID="a65ffd3d-87f5-4491-b71e-9823c314bc1f" Feb 17 17:00:26 crc kubenswrapper[4694]: E0217 17:00:26.995344 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c" Feb 17 17:00:26 crc kubenswrapper[4694]: E0217 17:00:26.995577 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5pcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5ddd85db87-8zxvf_openstack-operators(c97c1c8a-6504-4db2-ad45-0a0c2f84551f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:00:26 crc kubenswrapper[4694]: E0217 17:00:26.996842 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" podUID="c97c1c8a-6504-4db2-ad45-0a0c2f84551f" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.641908 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" event={"ID":"97487951-8888-4a3b-91fc-76d324fdf255","Type":"ContainerStarted","Data":"41c6e232f12d9060394093c268bba4742bf7d2bc4ad207daff9a6323dc2338d0"} Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.642348 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 17:00:27 crc 
kubenswrapper[4694]: I0217 17:00:27.643164 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" event={"ID":"6df3483a-cb85-4da5-a314-e0aea8874af8","Type":"ContainerStarted","Data":"43b703774a6c9b06d81549d39537028091c2f33218e6a1e7ff5cd255ba55e41d"} Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.643326 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.644108 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" event={"ID":"3629dece-1e4c-40cf-bb56-d15d6ca8aa44","Type":"ContainerStarted","Data":"70883ab8d2e057959e31f976f45ab1eed348370b012863ebe8cc74432437e032"} Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.644241 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.645302 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" event={"ID":"37924017-8b0a-4920-becb-89e528139e25","Type":"ContainerStarted","Data":"7480ee4085b181da0fc4697e082fb6de6278e3a18cf43dbc8b3e1851cd8ed951"} Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.645352 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.646872 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" 
event={"ID":"7eee970c-7e73-4420-adc3-331ee21c914c","Type":"ContainerStarted","Data":"617808b4908b907bec4a939499e77b7920e2e3d73dc61bf1dee22a3f04206407"} Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.647018 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.663358 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" podStartSLOduration=3.228561946 podStartE2EDuration="35.663339641s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.655783784 +0000 UTC m=+1062.412859108" lastFinishedPulling="2026-02-17 17:00:27.090561479 +0000 UTC m=+1094.847636803" observedRunningTime="2026-02-17 17:00:27.658090062 +0000 UTC m=+1095.415165396" watchObservedRunningTime="2026-02-17 17:00:27.663339641 +0000 UTC m=+1095.420414965" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.707422 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" podStartSLOduration=3.275078351 podStartE2EDuration="35.707402945s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.654362229 +0000 UTC m=+1062.411437553" lastFinishedPulling="2026-02-17 17:00:27.086686813 +0000 UTC m=+1094.843762147" observedRunningTime="2026-02-17 17:00:27.688553031 +0000 UTC m=+1095.445628365" watchObservedRunningTime="2026-02-17 17:00:27.707402945 +0000 UTC m=+1095.464478279" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.712478 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" podStartSLOduration=3.2874077440000002 podStartE2EDuration="35.71245619s" 
podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.657999598 +0000 UTC m=+1062.415074922" lastFinishedPulling="2026-02-17 17:00:27.083048044 +0000 UTC m=+1094.840123368" observedRunningTime="2026-02-17 17:00:27.70718223 +0000 UTC m=+1095.464257574" watchObservedRunningTime="2026-02-17 17:00:27.71245619 +0000 UTC m=+1095.469531514" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.728167 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc" podStartSLOduration=18.219942878 podStartE2EDuration="35.728149006s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 17:00:09.577077171 +0000 UTC m=+1077.334152495" lastFinishedPulling="2026-02-17 17:00:27.085283259 +0000 UTC m=+1094.842358623" observedRunningTime="2026-02-17 17:00:27.72427906 +0000 UTC m=+1095.481354394" watchObservedRunningTime="2026-02-17 17:00:27.728149006 +0000 UTC m=+1095.485224330" Feb 17 17:00:27 crc kubenswrapper[4694]: I0217 17:00:27.738593 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" podStartSLOduration=3.311056535 podStartE2EDuration="35.738578582s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.657722741 +0000 UTC m=+1062.414798075" lastFinishedPulling="2026-02-17 17:00:27.085244798 +0000 UTC m=+1094.842320122" observedRunningTime="2026-02-17 17:00:27.736413759 +0000 UTC m=+1095.493489083" watchObservedRunningTime="2026-02-17 17:00:27.738578582 +0000 UTC m=+1095.495653906" Feb 17 17:00:33 crc kubenswrapper[4694]: I0217 17:00:33.124942 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-dkblx" Feb 17 17:00:33 crc kubenswrapper[4694]: I0217 17:00:33.655813 4694 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-xgzng" Feb 17 17:00:33 crc kubenswrapper[4694]: I0217 17:00:33.687932 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-xgh9m" Feb 17 17:00:33 crc kubenswrapper[4694]: I0217 17:00:33.703830 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-7285z" Feb 17 17:00:37 crc kubenswrapper[4694]: I0217 17:00:37.898598 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:00:38 crc kubenswrapper[4694]: I0217 17:00:38.718645 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" event={"ID":"c9a788f4-d0b6-4275-9b3c-33f39fe70178","Type":"ContainerStarted","Data":"4bc07ff6ed5892698d96b660f7e24eed1da525eb72566ca355cc38fd5c0396d2"} Feb 17 17:00:38 crc kubenswrapper[4694]: I0217 17:00:38.719057 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" Feb 17 17:00:38 crc kubenswrapper[4694]: I0217 17:00:38.740109 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz" podStartSLOduration=18.056667628 podStartE2EDuration="46.740090473s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 17:00:09.695952116 +0000 UTC m=+1077.453027440" lastFinishedPulling="2026-02-17 17:00:38.379374961 +0000 UTC m=+1106.136450285" observedRunningTime="2026-02-17 17:00:38.737299904 +0000 UTC m=+1106.494375229" watchObservedRunningTime="2026-02-17 17:00:38.740090473 +0000 UTC m=+1106.497165797" Feb 17 17:00:38 
crc kubenswrapper[4694]: I0217 17:00:38.930971 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-f8krc"
Feb 17 17:00:39 crc kubenswrapper[4694]: E0217 17:00:39.895762 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" podUID="a65ffd3d-87f5-4491-b71e-9823c314bc1f"
Feb 17 17:00:42 crc kubenswrapper[4694]: E0217 17:00:42.903136 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:ab8e8207abec9cf5da7afded75ea76d1c3d2b9ab0f8e3124f518651e38f3123c\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" podUID="c97c1c8a-6504-4db2-ad45-0a0c2f84551f"
Feb 17 17:00:44 crc kubenswrapper[4694]: I0217 17:00:44.623962 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:00:44 crc kubenswrapper[4694]: I0217 17:00:44.624036 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:00:49 crc kubenswrapper[4694]: I0217 17:00:49.035644 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz"
Feb 17 17:00:55 crc kubenswrapper[4694]: I0217 17:00:55.850356 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" event={"ID":"a65ffd3d-87f5-4491-b71e-9823c314bc1f","Type":"ContainerStarted","Data":"684d81d02bd4da86fb45fcb279d722694736fd595659e2b91399229740badfa0"}
Feb 17 17:00:55 crc kubenswrapper[4694]: I0217 17:00:55.889158 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ggsvh" podStartSLOduration=2.474102881 podStartE2EDuration="1m2.889130763s" podCreationTimestamp="2026-02-17 16:59:53 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.661550686 +0000 UTC m=+1062.418626010" lastFinishedPulling="2026-02-17 17:00:55.076578528 +0000 UTC m=+1122.833653892" observedRunningTime="2026-02-17 17:00:55.87439735 +0000 UTC m=+1123.631472664" watchObservedRunningTime="2026-02-17 17:00:55.889130763 +0000 UTC m=+1123.646206117"
Feb 17 17:00:58 crc kubenswrapper[4694]: I0217 17:00:58.872001 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" event={"ID":"c97c1c8a-6504-4db2-ad45-0a0c2f84551f","Type":"ContainerStarted","Data":"1218ae52e1a94d067750b5cde7d7183cb93f717bcec299290a87f123f31fbd22"}
Feb 17 17:00:58 crc kubenswrapper[4694]: I0217 17:00:58.872675 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf"
Feb 17 17:00:58 crc kubenswrapper[4694]: I0217 17:00:58.896982 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf" podStartSLOduration=3.214750321 podStartE2EDuration="1m6.896967578s" podCreationTimestamp="2026-02-17 16:59:52 +0000 UTC" firstStartedPulling="2026-02-17 16:59:54.659866494 +0000 UTC m=+1062.416941818" lastFinishedPulling="2026-02-17 17:00:58.342083751 +0000 UTC m=+1126.099159075" observedRunningTime="2026-02-17 17:00:58.893462842 +0000 UTC m=+1126.650538166" watchObservedRunningTime="2026-02-17 17:00:58.896967578 +0000 UTC m=+1126.654042892"
Feb 17 17:01:03 crc kubenswrapper[4694]: I0217 17:01:03.280854 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5ddd85db87-8zxvf"
Feb 17 17:01:14 crc kubenswrapper[4694]: I0217 17:01:14.618541 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:01:14 crc kubenswrapper[4694]: I0217 17:01:14.619079 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.821298 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-76j6h"]
Feb 17 17:01:21 crc kubenswrapper[4694]: E0217 17:01:21.822714 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf31ae2-9449-4362-8a00-9e1fba466f0b" containerName="collect-profiles"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.822732 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf31ae2-9449-4362-8a00-9e1fba466f0b" containerName="collect-profiles"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.822902 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf31ae2-9449-4362-8a00-9e1fba466f0b" containerName="collect-profiles"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.823902 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.827120 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.827901 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vp8jg"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.828086 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.828231 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.845011 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-76j6h"]
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.858092 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qqz7q"]
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.859427 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.862030 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.898094 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qqz7q"]
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.968941 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043fd08-332a-435f-a998-3b68773525fe-config\") pod \"dnsmasq-dns-675f4bcbfc-76j6h\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.968993 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-config\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.969352 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x69hr\" (UniqueName: \"kubernetes.io/projected/9043fd08-332a-435f-a998-3b68773525fe-kube-api-access-x69hr\") pod \"dnsmasq-dns-675f4bcbfc-76j6h\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.969664 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:21 crc kubenswrapper[4694]: I0217 17:01:21.969845 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mp7r\" (UniqueName: \"kubernetes.io/projected/e6daf861-6f6c-495b-9cea-1334de9f4f90-kube-api-access-6mp7r\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.072262 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mp7r\" (UniqueName: \"kubernetes.io/projected/e6daf861-6f6c-495b-9cea-1334de9f4f90-kube-api-access-6mp7r\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.072419 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-config\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.072447 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043fd08-332a-435f-a998-3b68773525fe-config\") pod \"dnsmasq-dns-675f4bcbfc-76j6h\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.072683 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x69hr\" (UniqueName: \"kubernetes.io/projected/9043fd08-332a-435f-a998-3b68773525fe-kube-api-access-x69hr\") pod \"dnsmasq-dns-675f4bcbfc-76j6h\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.072763 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.073279 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043fd08-332a-435f-a998-3b68773525fe-config\") pod \"dnsmasq-dns-675f4bcbfc-76j6h\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.073869 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.073932 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-config\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.094018 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mp7r\" (UniqueName: \"kubernetes.io/projected/e6daf861-6f6c-495b-9cea-1334de9f4f90-kube-api-access-6mp7r\") pod \"dnsmasq-dns-78dd6ddcc-qqz7q\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.096840 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x69hr\" (UniqueName: \"kubernetes.io/projected/9043fd08-332a-435f-a998-3b68773525fe-kube-api-access-x69hr\") pod \"dnsmasq-dns-675f4bcbfc-76j6h\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.151522 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.188017 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q"
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.608550 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-76j6h"]
Feb 17 17:01:22 crc kubenswrapper[4694]: I0217 17:01:22.689686 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qqz7q"]
Feb 17 17:01:22 crc kubenswrapper[4694]: W0217 17:01:22.691544 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6daf861_6f6c_495b_9cea_1334de9f4f90.slice/crio-c04f1353453a4b79fedf5d9d1730d91b86b8576a3e2e2ff1802b11762fcddd6b WatchSource:0}: Error finding container c04f1353453a4b79fedf5d9d1730d91b86b8576a3e2e2ff1802b11762fcddd6b: Status 404 returned error can't find the container with id c04f1353453a4b79fedf5d9d1730d91b86b8576a3e2e2ff1802b11762fcddd6b
Feb 17 17:01:23 crc kubenswrapper[4694]: I0217 17:01:23.027881 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q" event={"ID":"e6daf861-6f6c-495b-9cea-1334de9f4f90","Type":"ContainerStarted","Data":"c04f1353453a4b79fedf5d9d1730d91b86b8576a3e2e2ff1802b11762fcddd6b"}
Feb 17 17:01:23 crc kubenswrapper[4694]: I0217 17:01:23.029473 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h" event={"ID":"9043fd08-332a-435f-a998-3b68773525fe","Type":"ContainerStarted","Data":"24f0d430f711a1ff8f26e30a2c726bc9580577b8b8516fb99bd2883c715b00ac"}
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.757658 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-76j6h"]
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.784324 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-8plwp"]
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.785416 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.808521 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-8plwp"]
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.821943 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.824818 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-config\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.824952 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbz5d\" (UniqueName: \"kubernetes.io/projected/e554efe9-fa59-446a-a376-a7339a08bf7c-kube-api-access-qbz5d\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.926529 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-config\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.926600 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbz5d\" (UniqueName: \"kubernetes.io/projected/e554efe9-fa59-446a-a376-a7339a08bf7c-kube-api-access-qbz5d\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.926738 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.927413 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-config\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.928499 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:24 crc kubenswrapper[4694]: I0217 17:01:24.966857 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbz5d\" (UniqueName: \"kubernetes.io/projected/e554efe9-fa59-446a-a376-a7339a08bf7c-kube-api-access-qbz5d\") pod \"dnsmasq-dns-666b6646f7-8plwp\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") " pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.108936 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.163863 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qqz7q"]
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.218689 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dxdcp"]
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.219926 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.231419 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.231508 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-config\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.231579 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7rb\" (UniqueName: \"kubernetes.io/projected/ee6f4177-5e36-4e81-a1fe-ed0a715df304-kube-api-access-zm7rb\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.230379 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dxdcp"]
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.334266 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-config\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.334347 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7rb\" (UniqueName: \"kubernetes.io/projected/ee6f4177-5e36-4e81-a1fe-ed0a715df304-kube-api-access-zm7rb\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.334401 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.335283 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-config\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.335409 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.374089 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7rb\" (UniqueName: \"kubernetes.io/projected/ee6f4177-5e36-4e81-a1fe-ed0a715df304-kube-api-access-zm7rb\") pod \"dnsmasq-dns-57d769cc4f-dxdcp\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") " pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.541664 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.647819 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-8plwp"]
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.945532 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.948205 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.961724 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.972883 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.973007 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.973159 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.973378 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.973535 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.973722 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 17 17:01:25 crc kubenswrapper[4694]: I0217 17:01:25.975379 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6ddwk"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.046705 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.046788 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.046838 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.046861 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.046888 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.047070 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-server-conf\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.047195 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/647b8309-483b-4f58-8360-202bb4b14824-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.047251 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-config-data\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.047276 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plc8h\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-kube-api-access-plc8h\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.047305 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.047333 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/647b8309-483b-4f58-8360-202bb4b14824-pod-info\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148548 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-server-conf\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148620 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/647b8309-483b-4f58-8360-202bb4b14824-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148649 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-config-data\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148663 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plc8h\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-kube-api-access-plc8h\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148683 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148703 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/647b8309-483b-4f58-8360-202bb4b14824-pod-info\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148722 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148783 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148802 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148837 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.148864 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.149283 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.152185 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-server-conf\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.153037 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-config-data\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.155876 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.156200 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.157678 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/647b8309-483b-4f58-8360-202bb4b14824-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0"
Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.158010 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.160146 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/647b8309-483b-4f58-8360-202bb4b14824-pod-info\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.160812 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.166178 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.185403 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plc8h\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-kube-api-access-plc8h\") pod \"rabbitmq-server-0\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.191090 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"647b8309-483b-4f58-8360-202bb4b14824\") " pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.276180 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.321218 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.322787 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.327785 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.328030 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.328347 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.328493 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-k2bk5" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.328716 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.329820 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.329910 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.384473 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:01:26 crc 
kubenswrapper[4694]: I0217 17:01:26.473547 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473633 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473654 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473681 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473698 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30f5bee5-cb28-4508-b091-35e85e299afa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473803 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473825 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30f5bee5-cb28-4508-b091-35e85e299afa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473881 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.473962 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.474065 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgzrz\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-kube-api-access-rgzrz\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc 
kubenswrapper[4694]: I0217 17:01:26.474088 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575298 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575653 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30f5bee5-cb28-4508-b091-35e85e299afa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575673 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575688 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575725 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgzrz\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-kube-api-access-rgzrz\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575745 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575780 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575817 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575839 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575880 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.575895 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30f5bee5-cb28-4508-b091-35e85e299afa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.576234 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.576404 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.576796 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.576839 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.577130 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.578012 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.582248 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30f5bee5-cb28-4508-b091-35e85e299afa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.582382 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.590591 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30f5bee5-cb28-4508-b091-35e85e299afa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: 
I0217 17:01:26.593472 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.595428 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgzrz\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-kube-api-access-rgzrz\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.606292 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:26 crc kubenswrapper[4694]: I0217 17:01:26.646339 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.535027 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.536166 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.537816 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.538509 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.538915 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gwtts" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.539119 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.545874 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.548981 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690487 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c805cf-d310-4594-8584-1061330e4c94-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690639 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29bsl\" (UniqueName: \"kubernetes.io/projected/a0c805cf-d310-4594-8584-1061330e4c94-kube-api-access-29bsl\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690690 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-config-data-default\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690724 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690767 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690799 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0c805cf-d310-4594-8584-1061330e4c94-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690838 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-kolla-config\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.690869 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/a0c805cf-d310-4594-8584-1061330e4c94-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797297 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c805cf-d310-4594-8584-1061330e4c94-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797366 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29bsl\" (UniqueName: \"kubernetes.io/projected/a0c805cf-d310-4594-8584-1061330e4c94-kube-api-access-29bsl\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797409 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-config-data-default\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797450 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797509 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797558 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0c805cf-d310-4594-8584-1061330e4c94-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797593 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-kolla-config\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.797661 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a0c805cf-d310-4594-8584-1061330e4c94-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.798111 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a0c805cf-d310-4594-8584-1061330e4c94-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.798955 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: 
I0217 17:01:27.804043 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0c805cf-d310-4594-8584-1061330e4c94-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.806340 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-config-data-default\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.807004 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-kolla-config\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.809086 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0c805cf-d310-4594-8584-1061330e4c94-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.817754 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c805cf-d310-4594-8584-1061330e4c94-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.850213 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29bsl\" (UniqueName: 
\"kubernetes.io/projected/a0c805cf-d310-4594-8584-1061330e4c94-kube-api-access-29bsl\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:27 crc kubenswrapper[4694]: I0217 17:01:27.861421 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a0c805cf-d310-4594-8584-1061330e4c94\") " pod="openstack/openstack-galera-0" Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.159725 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.919302 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.920422 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.920556 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.924198 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.924395 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.924508 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rmlhn" Feb 17 17:01:28 crc kubenswrapper[4694]: I0217 17:01:28.924682 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.016988 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.017056 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.017098 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.017143 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.017174 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: 
I0217 17:01:29.017216 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.017236 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.017288 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6b9d\" (UniqueName: \"kubernetes.io/projected/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-kube-api-access-c6b9d\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.101799 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" event={"ID":"e554efe9-fa59-446a-a376-a7339a08bf7c","Type":"ContainerStarted","Data":"6ea506d1ee1c0dadfb9b06a3a28b7c4aac7dc9ec533c81d4b08516ef8244d64c"} Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.118830 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.118890 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.118921 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6b9d\" (UniqueName: \"kubernetes.io/projected/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-kube-api-access-c6b9d\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.118981 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.119024 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.119311 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.119591 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.119863 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.119905 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.120868 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.121186 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.121750 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.122553 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.127344 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.127976 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.141728 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6b9d\" (UniqueName: \"kubernetes.io/projected/a7cab10b-d837-44e6-81c7-8bfdb36a4d3c-kube-api-access-c6b9d\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.154403 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c\") " pod="openstack/openstack-cell1-galera-0" Feb 17 
17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.197397 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.199934 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.202708 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.202829 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.202878 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nznsg" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.213163 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.245245 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.323480 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f9dba1-ca35-4d40-b07b-44a141b8a80b-kolla-config\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.323541 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f9dba1-ca35-4d40-b07b-44a141b8a80b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.323565 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f9dba1-ca35-4d40-b07b-44a141b8a80b-config-data\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.323770 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f9dba1-ca35-4d40-b07b-44a141b8a80b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.323896 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbswj\" (UniqueName: \"kubernetes.io/projected/11f9dba1-ca35-4d40-b07b-44a141b8a80b-kube-api-access-lbswj\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 
17:01:29.425649 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f9dba1-ca35-4d40-b07b-44a141b8a80b-kolla-config\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.425719 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f9dba1-ca35-4d40-b07b-44a141b8a80b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.425753 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f9dba1-ca35-4d40-b07b-44a141b8a80b-config-data\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.425803 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f9dba1-ca35-4d40-b07b-44a141b8a80b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.425860 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbswj\" (UniqueName: \"kubernetes.io/projected/11f9dba1-ca35-4d40-b07b-44a141b8a80b-kube-api-access-lbswj\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.426701 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f9dba1-ca35-4d40-b07b-44a141b8a80b-kolla-config\") pod 
\"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.426817 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f9dba1-ca35-4d40-b07b-44a141b8a80b-config-data\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.429029 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f9dba1-ca35-4d40-b07b-44a141b8a80b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.429358 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f9dba1-ca35-4d40-b07b-44a141b8a80b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.451256 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbswj\" (UniqueName: \"kubernetes.io/projected/11f9dba1-ca35-4d40-b07b-44a141b8a80b-kube-api-access-lbswj\") pod \"memcached-0\" (UID: \"11f9dba1-ca35-4d40-b07b-44a141b8a80b\") " pod="openstack/memcached-0" Feb 17 17:01:29 crc kubenswrapper[4694]: I0217 17:01:29.526996 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.535109 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.536186 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.538547 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xnvg5" Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.548510 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.660083 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgbfr\" (UniqueName: \"kubernetes.io/projected/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0-kube-api-access-qgbfr\") pod \"kube-state-metrics-0\" (UID: \"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0\") " pod="openstack/kube-state-metrics-0" Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.762236 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgbfr\" (UniqueName: \"kubernetes.io/projected/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0-kube-api-access-qgbfr\") pod \"kube-state-metrics-0\" (UID: \"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0\") " pod="openstack/kube-state-metrics-0" Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.780691 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgbfr\" (UniqueName: \"kubernetes.io/projected/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0-kube-api-access-qgbfr\") pod \"kube-state-metrics-0\" (UID: \"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0\") " pod="openstack/kube-state-metrics-0" Feb 17 17:01:31 crc kubenswrapper[4694]: I0217 17:01:31.853891 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.049927 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.057830 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.078443 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.078733 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.078955 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.079088 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.080070 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ggn7x" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.132683 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.207506 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-stczv"] Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.208400 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.212351 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.212529 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.212657 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4vn2j" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.227689 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-stczv"] Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232236 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf651b7-0b06-4b95-916e-e7fe6630d272-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232286 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf651b7-0b06-4b95-916e-e7fe6630d272-config\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232331 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232356 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232390 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232439 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232484 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8ft\" (UniqueName: \"kubernetes.io/projected/abf651b7-0b06-4b95-916e-e7fe6630d272-kube-api-access-5s8ft\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.232514 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf651b7-0b06-4b95-916e-e7fe6630d272-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.246442 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-q88vj"] Feb 17 
17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.248525 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.287360 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-q88vj"] Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335394 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkzl\" (UniqueName: \"kubernetes.io/projected/adef318b-03c0-4281-8b77-30b76a8904e6-kube-api-access-dxkzl\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335444 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf651b7-0b06-4b95-916e-e7fe6630d272-config\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335467 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/45514b0e-57f3-494a-823a-2a0f0c2f728d-ovn-controller-tls-certs\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335497 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335514 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335529 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45514b0e-57f3-494a-823a-2a0f0c2f728d-scripts\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335554 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335571 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-lib\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335585 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-run\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335607 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-log\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335625 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45514b0e-57f3-494a-823a-2a0f0c2f728d-combined-ca-bundle\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335664 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk4gd\" (UniqueName: \"kubernetes.io/projected/45514b0e-57f3-494a-823a-2a0f0c2f728d-kube-api-access-mk4gd\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335697 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335713 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-log-ovn\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335747 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8ft\" (UniqueName: 
\"kubernetes.io/projected/abf651b7-0b06-4b95-916e-e7fe6630d272-kube-api-access-5s8ft\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335764 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-run-ovn\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335792 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-etc-ovs\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335809 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf651b7-0b06-4b95-916e-e7fe6630d272-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335823 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-run\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335854 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adef318b-03c0-4281-8b77-30b76a8904e6-scripts\") pod 
\"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.335900 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf651b7-0b06-4b95-916e-e7fe6630d272-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.337548 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abf651b7-0b06-4b95-916e-e7fe6630d272-config\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.337577 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abf651b7-0b06-4b95-916e-e7fe6630d272-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.339030 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/abf651b7-0b06-4b95-916e-e7fe6630d272-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.339037 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.345391 4694 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.346623 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.347109 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abf651b7-0b06-4b95-916e-e7fe6630d272-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.360935 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8ft\" (UniqueName: \"kubernetes.io/projected/abf651b7-0b06-4b95-916e-e7fe6630d272-kube-api-access-5s8ft\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.371845 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"abf651b7-0b06-4b95-916e-e7fe6630d272\") " pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.437518 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-lib\") pod \"ovn-controller-ovs-q88vj\" (UID: 
\"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.437888 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-run\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.437918 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-log\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.437940 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45514b0e-57f3-494a-823a-2a0f0c2f728d-combined-ca-bundle\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.437967 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk4gd\" (UniqueName: \"kubernetes.io/projected/45514b0e-57f3-494a-823a-2a0f0c2f728d-kube-api-access-mk4gd\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.437997 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-log-ovn\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 
17:01:35.438049 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-run-ovn\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.438072 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-etc-ovs\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.438098 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-run\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.438160 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adef318b-03c0-4281-8b77-30b76a8904e6-scripts\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.438210 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkzl\" (UniqueName: \"kubernetes.io/projected/adef318b-03c0-4281-8b77-30b76a8904e6-kube-api-access-dxkzl\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.438237 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/45514b0e-57f3-494a-823a-2a0f0c2f728d-ovn-controller-tls-certs\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.438280 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45514b0e-57f3-494a-823a-2a0f0c2f728d-scripts\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.440566 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45514b0e-57f3-494a-823a-2a0f0c2f728d-scripts\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.441364 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-run-ovn\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.441479 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-run\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.441626 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-log\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 
17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.441903 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/45514b0e-57f3-494a-823a-2a0f0c2f728d-var-log-ovn\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.441968 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-run\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.442113 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-etc-ovs\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.442240 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.442454 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/adef318b-03c0-4281-8b77-30b76a8904e6-var-lib\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.446686 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adef318b-03c0-4281-8b77-30b76a8904e6-scripts\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.460264 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/45514b0e-57f3-494a-823a-2a0f0c2f728d-ovn-controller-tls-certs\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.460379 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45514b0e-57f3-494a-823a-2a0f0c2f728d-combined-ca-bundle\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.464698 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkzl\" (UniqueName: \"kubernetes.io/projected/adef318b-03c0-4281-8b77-30b76a8904e6-kube-api-access-dxkzl\") pod \"ovn-controller-ovs-q88vj\" (UID: \"adef318b-03c0-4281-8b77-30b76a8904e6\") " pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.467116 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk4gd\" (UniqueName: \"kubernetes.io/projected/45514b0e-57f3-494a-823a-2a0f0c2f728d-kube-api-access-mk4gd\") pod \"ovn-controller-stczv\" (UID: \"45514b0e-57f3-494a-823a-2a0f0c2f728d\") " pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.525429 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-stczv" Feb 17 17:01:35 crc kubenswrapper[4694]: I0217 17:01:35.593892 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.630105 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.634871 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.638495 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.638955 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qg2b4" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.639032 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.640785 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.644305 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.793488 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/391b1870-6558-40d0-be12-d31b3a57ed32-config\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.793546 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.793574 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/391b1870-6558-40d0-be12-d31b3a57ed32-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.793763 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.793838 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.793977 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/391b1870-6558-40d0-be12-d31b3a57ed32-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.794119 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.794193 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ql86\" (UniqueName: \"kubernetes.io/projected/391b1870-6558-40d0-be12-d31b3a57ed32-kube-api-access-6ql86\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.896302 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.896482 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/391b1870-6558-40d0-be12-d31b3a57ed32-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.896675 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.896766 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.896927 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/391b1870-6558-40d0-be12-d31b3a57ed32-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.897039 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.897115 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ql86\" (UniqueName: \"kubernetes.io/projected/391b1870-6558-40d0-be12-d31b3a57ed32-kube-api-access-6ql86\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.897284 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391b1870-6558-40d0-be12-d31b3a57ed32-config\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.897333 4694 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.897942 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/391b1870-6558-40d0-be12-d31b3a57ed32-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.898859 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/391b1870-6558-40d0-be12-d31b3a57ed32-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.900688 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391b1870-6558-40d0-be12-d31b3a57ed32-config\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.902836 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.903863 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.904063 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/391b1870-6558-40d0-be12-d31b3a57ed32-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.921774 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ql86\" (UniqueName: \"kubernetes.io/projected/391b1870-6558-40d0-be12-d31b3a57ed32-kube-api-access-6ql86\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:38 crc kubenswrapper[4694]: I0217 17:01:38.929261 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"391b1870-6558-40d0-be12-d31b3a57ed32\") " pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:39 crc kubenswrapper[4694]: I0217 17:01:39.008783 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 17:01:40 crc kubenswrapper[4694]: E0217 17:01:40.096722 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 17:01:40 crc kubenswrapper[4694]: E0217 17:01:40.097197 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6mp7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-qqz7q_openstack(e6daf861-6f6c-495b-9cea-1334de9f4f90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:01:40 crc kubenswrapper[4694]: E0217 17:01:40.098919 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q" podUID="e6daf861-6f6c-495b-9cea-1334de9f4f90" Feb 17 17:01:40 crc kubenswrapper[4694]: E0217 17:01:40.130264 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 17:01:40 crc kubenswrapper[4694]: E0217 17:01:40.130424 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x69hr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-76j6h_openstack(9043fd08-332a-435f-a998-3b68773525fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:01:40 crc kubenswrapper[4694]: E0217 17:01:40.131853 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h" podUID="9043fd08-332a-435f-a998-3b68773525fe" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.657674 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.667401 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.751157 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-config\") pod \"e6daf861-6f6c-495b-9cea-1334de9f4f90\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.751233 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043fd08-332a-435f-a998-3b68773525fe-config\") pod \"9043fd08-332a-435f-a998-3b68773525fe\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.751315 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mp7r\" (UniqueName: \"kubernetes.io/projected/e6daf861-6f6c-495b-9cea-1334de9f4f90-kube-api-access-6mp7r\") pod \"e6daf861-6f6c-495b-9cea-1334de9f4f90\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.751381 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x69hr\" (UniqueName: \"kubernetes.io/projected/9043fd08-332a-435f-a998-3b68773525fe-kube-api-access-x69hr\") pod \"9043fd08-332a-435f-a998-3b68773525fe\" (UID: \"9043fd08-332a-435f-a998-3b68773525fe\") " Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.751415 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-dns-svc\") pod \"e6daf861-6f6c-495b-9cea-1334de9f4f90\" (UID: \"e6daf861-6f6c-495b-9cea-1334de9f4f90\") " Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.752028 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6daf861-6f6c-495b-9cea-1334de9f4f90" (UID: "e6daf861-6f6c-495b-9cea-1334de9f4f90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.752030 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9043fd08-332a-435f-a998-3b68773525fe-config" (OuterVolumeSpecName: "config") pod "9043fd08-332a-435f-a998-3b68773525fe" (UID: "9043fd08-332a-435f-a998-3b68773525fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.752314 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-config" (OuterVolumeSpecName: "config") pod "e6daf861-6f6c-495b-9cea-1334de9f4f90" (UID: "e6daf861-6f6c-495b-9cea-1334de9f4f90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.758875 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6daf861-6f6c-495b-9cea-1334de9f4f90-kube-api-access-6mp7r" (OuterVolumeSpecName: "kube-api-access-6mp7r") pod "e6daf861-6f6c-495b-9cea-1334de9f4f90" (UID: "e6daf861-6f6c-495b-9cea-1334de9f4f90"). InnerVolumeSpecName "kube-api-access-6mp7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.760271 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9043fd08-332a-435f-a998-3b68773525fe-kube-api-access-x69hr" (OuterVolumeSpecName: "kube-api-access-x69hr") pod "9043fd08-332a-435f-a998-3b68773525fe" (UID: "9043fd08-332a-435f-a998-3b68773525fe"). InnerVolumeSpecName "kube-api-access-x69hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.792284 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:01:40 crc kubenswrapper[4694]: W0217 17:01:40.795868 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647b8309_483b_4f58_8360_202bb4b14824.slice/crio-0bf4341153fbff008ea6884a8b134690cb8e506e1d6358c7fcd1e75f26b63914 WatchSource:0}: Error finding container 0bf4341153fbff008ea6884a8b134690cb8e506e1d6358c7fcd1e75f26b63914: Status 404 returned error can't find the container with id 0bf4341153fbff008ea6884a8b134690cb8e506e1d6358c7fcd1e75f26b63914 Feb 17 17:01:40 crc kubenswrapper[4694]: W0217 17:01:40.819429 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0c805cf_d310_4594_8584_1061330e4c94.slice/crio-7b46064e32fdf3557cdb8748f5acf683c8c601a495b5cdb842111829e3d31b93 WatchSource:0}: Error finding container 7b46064e32fdf3557cdb8748f5acf683c8c601a495b5cdb842111829e3d31b93: Status 404 returned error can't find the container with id 7b46064e32fdf3557cdb8748f5acf683c8c601a495b5cdb842111829e3d31b93 Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.821053 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 17:01:40 crc kubenswrapper[4694]: W0217 17:01:40.827087 4694 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cab10b_d837_44e6_81c7_8bfdb36a4d3c.slice/crio-74c0bb43098fb70e03dcafa385b7538a3f3b48daa0636ce3c51f6c641d694ca2 WatchSource:0}: Error finding container 74c0bb43098fb70e03dcafa385b7538a3f3b48daa0636ce3c51f6c641d694ca2: Status 404 returned error can't find the container with id 74c0bb43098fb70e03dcafa385b7538a3f3b48daa0636ce3c51f6c641d694ca2 Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.827839 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 17:01:40 crc kubenswrapper[4694]: W0217 17:01:40.833816 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee6f4177_5e36_4e81_a1fe_ed0a715df304.slice/crio-d61cfcf41a4bb12dd7828fd537b0189011e350e3543c71e95f8025b25e327b7c WatchSource:0}: Error finding container d61cfcf41a4bb12dd7828fd537b0189011e350e3543c71e95f8025b25e327b7c: Status 404 returned error can't find the container with id d61cfcf41a4bb12dd7828fd537b0189011e350e3543c71e95f8025b25e327b7c Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.839408 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dxdcp"] Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.854943 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mp7r\" (UniqueName: \"kubernetes.io/projected/e6daf861-6f6c-495b-9cea-1334de9f4f90-kube-api-access-6mp7r\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.854977 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x69hr\" (UniqueName: \"kubernetes.io/projected/9043fd08-332a-435f-a998-3b68773525fe-kube-api-access-x69hr\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.855054 4694 reconciler_common.go:293] "Volume detached for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.855084 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6daf861-6f6c-495b-9cea-1334de9f4f90-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.855095 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9043fd08-332a-435f-a998-3b68773525fe-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:40 crc kubenswrapper[4694]: I0217 17:01:40.858362 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.199467 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30f5bee5-cb28-4508-b091-35e85e299afa","Type":"ContainerStarted","Data":"6dc3707fedafb2c932e2dc60e99c8c2b30db0073dfe0206f4b2be22b7167ed7c"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.201424 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h" event={"ID":"9043fd08-332a-435f-a998-3b68773525fe","Type":"ContainerDied","Data":"24f0d430f711a1ff8f26e30a2c726bc9580577b8b8516fb99bd2883c715b00ac"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.201487 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-76j6h" Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.203698 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0c805cf-d310-4594-8584-1061330e4c94","Type":"ContainerStarted","Data":"7b46064e32fdf3557cdb8748f5acf683c8c601a495b5cdb842111829e3d31b93"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.204596 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q" event={"ID":"e6daf861-6f6c-495b-9cea-1334de9f4f90","Type":"ContainerDied","Data":"c04f1353453a4b79fedf5d9d1730d91b86b8576a3e2e2ff1802b11762fcddd6b"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.204704 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qqz7q" Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.207562 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"647b8309-483b-4f58-8360-202bb4b14824","Type":"ContainerStarted","Data":"0bf4341153fbff008ea6884a8b134690cb8e506e1d6358c7fcd1e75f26b63914"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.211808 4694 generic.go:334] "Generic (PLEG): container finished" podID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerID="b2ffc52c4df72d0a6151f99c7a19941e9d273995bdeb1009b90ad65499132677" exitCode=0 Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.211907 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" event={"ID":"ee6f4177-5e36-4e81-a1fe-ed0a715df304","Type":"ContainerDied","Data":"b2ffc52c4df72d0a6151f99c7a19941e9d273995bdeb1009b90ad65499132677"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.211939 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" 
event={"ID":"ee6f4177-5e36-4e81-a1fe-ed0a715df304","Type":"ContainerStarted","Data":"d61cfcf41a4bb12dd7828fd537b0189011e350e3543c71e95f8025b25e327b7c"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.214911 4694 generic.go:334] "Generic (PLEG): container finished" podID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerID="2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76" exitCode=0 Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.214977 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" event={"ID":"e554efe9-fa59-446a-a376-a7339a08bf7c","Type":"ContainerDied","Data":"2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.217140 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c","Type":"ContainerStarted","Data":"74c0bb43098fb70e03dcafa385b7538a3f3b48daa0636ce3c51f6c641d694ca2"} Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.252601 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.326569 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-76j6h"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.334705 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-76j6h"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.357061 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-stczv"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.369312 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qqz7q"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.375537 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 17:01:41 
crc kubenswrapper[4694]: I0217 17:01:41.384246 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qqz7q"] Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.404002 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 17:01:41 crc kubenswrapper[4694]: E0217 17:01:41.446815 4694 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 17 17:01:41 crc kubenswrapper[4694]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e554efe9-fa59-446a-a376-a7339a08bf7c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 17:01:41 crc kubenswrapper[4694]: > podSandboxID="6ea506d1ee1c0dadfb9b06a3a28b7c4aac7dc9ec533c81d4b08516ef8244d64c" Feb 17 17:01:41 crc kubenswrapper[4694]: E0217 17:01:41.447304 4694 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 17 17:01:41 crc kubenswrapper[4694]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbz5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-8plwp_openstack(e554efe9-fa59-446a-a376-a7339a08bf7c): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e554efe9-fa59-446a-a376-a7339a08bf7c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 17 17:01:41 crc kubenswrapper[4694]: > logger="UnhandledError" Feb 17 17:01:41 crc kubenswrapper[4694]: E0217 17:01:41.448658 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e554efe9-fa59-446a-a376-a7339a08bf7c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" Feb 17 17:01:41 crc kubenswrapper[4694]: I0217 17:01:41.970848 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-q88vj"] Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.075940 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 17:01:42 crc 
kubenswrapper[4694]: W0217 17:01:42.117533 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf651b7_0b06_4b95_916e_e7fe6630d272.slice/crio-610a93dd54afc7ce78be40ba4cecc7f78773216435e47eb6743d1c803919164d WatchSource:0}: Error finding container 610a93dd54afc7ce78be40ba4cecc7f78773216435e47eb6743d1c803919164d: Status 404 returned error can't find the container with id 610a93dd54afc7ce78be40ba4cecc7f78773216435e47eb6743d1c803919164d Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.233657 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv" event={"ID":"45514b0e-57f3-494a-823a-2a0f0c2f728d","Type":"ContainerStarted","Data":"656a77857363d101935b2b0be59f209b3a3ce6e9d971a858210995058fd1d462"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.235784 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"391b1870-6558-40d0-be12-d31b3a57ed32","Type":"ContainerStarted","Data":"81432c65c1211356b4594b20711d6cfb59be32ca7f808f2fe0b48060eca0b5b7"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.236748 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0","Type":"ContainerStarted","Data":"dc1df6f66b8a912b9919e5143896ab457e090809278ab63c016f85a74fa75010"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.239554 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-q88vj" event={"ID":"adef318b-03c0-4281-8b77-30b76a8904e6","Type":"ContainerStarted","Data":"c3c13a10aec43bd7799984527d559acc97ad513172c06bac61b855ef26e0d510"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.242025 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" 
event={"ID":"ee6f4177-5e36-4e81-a1fe-ed0a715df304","Type":"ContainerStarted","Data":"be87d18c7c9fae6df4b7d7c5487b8b52c6b1fb47aae5c7dbc482fd06e9e7d1df"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.242584 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.243357 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf651b7-0b06-4b95-916e-e7fe6630d272","Type":"ContainerStarted","Data":"610a93dd54afc7ce78be40ba4cecc7f78773216435e47eb6743d1c803919164d"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.246654 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"11f9dba1-ca35-4d40-b07b-44a141b8a80b","Type":"ContainerStarted","Data":"4a1cc54467b5faace71d5261d2fcaa25c6ba8fc1b09d6c53a7770d5bd9859ed2"} Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.262053 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" podStartSLOduration=17.262034573 podStartE2EDuration="17.262034573s" podCreationTimestamp="2026-02-17 17:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:01:42.258206008 +0000 UTC m=+1170.015281362" watchObservedRunningTime="2026-02-17 17:01:42.262034573 +0000 UTC m=+1170.019109897" Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.907179 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9043fd08-332a-435f-a998-3b68773525fe" path="/var/lib/kubelet/pods/9043fd08-332a-435f-a998-3b68773525fe/volumes" Feb 17 17:01:42 crc kubenswrapper[4694]: I0217 17:01:42.908095 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6daf861-6f6c-495b-9cea-1334de9f4f90" 
path="/var/lib/kubelet/pods/e6daf861-6f6c-495b-9cea-1334de9f4f90/volumes" Feb 17 17:01:44 crc kubenswrapper[4694]: I0217 17:01:44.618122 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:01:44 crc kubenswrapper[4694]: I0217 17:01:44.618471 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:01:44 crc kubenswrapper[4694]: I0217 17:01:44.618517 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:01:44 crc kubenswrapper[4694]: I0217 17:01:44.619189 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5aa651e570a8961f4584e9fe11d3f397047e9a6daf1e15f72d714be968799658"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:01:44 crc kubenswrapper[4694]: I0217 17:01:44.619244 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://5aa651e570a8961f4584e9fe11d3f397047e9a6daf1e15f72d714be968799658" gracePeriod=600 Feb 17 17:01:45 crc kubenswrapper[4694]: I0217 17:01:45.273203 4694 generic.go:334] "Generic (PLEG): container finished" 
podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="5aa651e570a8961f4584e9fe11d3f397047e9a6daf1e15f72d714be968799658" exitCode=0 Feb 17 17:01:45 crc kubenswrapper[4694]: I0217 17:01:45.273253 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"5aa651e570a8961f4584e9fe11d3f397047e9a6daf1e15f72d714be968799658"} Feb 17 17:01:45 crc kubenswrapper[4694]: I0217 17:01:45.273288 4694 scope.go:117] "RemoveContainer" containerID="d4749332bdc4a5e5d10099fdef7b4d20f81424c0b600631d13aa0f1be1b09107" Feb 17 17:01:48 crc kubenswrapper[4694]: I0217 17:01:48.296393 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"4d6c34eac314e32bb2a700fd2365f6cc5994e5e6d675cca523ef76c638d044d6"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.305402 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" event={"ID":"e554efe9-fa59-446a-a376-a7339a08bf7c","Type":"ContainerStarted","Data":"c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.306194 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.307727 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"11f9dba1-ca35-4d40-b07b-44a141b8a80b","Type":"ContainerStarted","Data":"138529bd3f1031596949eac5681923f9e7388326082d6e187f40d1915089389c"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.307836 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.309858 
4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"391b1870-6558-40d0-be12-d31b3a57ed32","Type":"ContainerStarted","Data":"09f1df4556df27ca4db4e5b13ca23c72d73fecf0f9770057a590f12291bafebd"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.311425 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c","Type":"ContainerStarted","Data":"52ba920e44ae807a9721919f1305c3ec54027076bac030bb31c59d9c1277687a"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.312655 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-q88vj" event={"ID":"adef318b-03c0-4281-8b77-30b76a8904e6","Type":"ContainerStarted","Data":"5bbaeeb652857d08d1dffc26a1dadba4e60124f4943b3e81f2d1f5a7ac12725b"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.313923 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf651b7-0b06-4b95-916e-e7fe6630d272","Type":"ContainerStarted","Data":"221d40e32dab687f89f8f71a874d8e8f3577e0f3aaa3b109f85eb6a4ce46a37b"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.315347 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv" event={"ID":"45514b0e-57f3-494a-823a-2a0f0c2f728d","Type":"ContainerStarted","Data":"d48bfdccc6e4f2a86e321e6decf57cb09ec16051fa7e66c587df43df8add5986"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.315455 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-stczv" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.318009 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0","Type":"ContainerStarted","Data":"59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 
17:01:49.318125 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.319809 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0c805cf-d310-4594-8584-1061330e4c94","Type":"ContainerStarted","Data":"81541620d267325bb3505a8a61175d5bf5494eb53d9625d7b2d027136080e556"} Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.330719 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" podStartSLOduration=13.350603979 podStartE2EDuration="25.330701179s" podCreationTimestamp="2026-02-17 17:01:24 +0000 UTC" firstStartedPulling="2026-02-17 17:01:28.198857555 +0000 UTC m=+1155.955932879" lastFinishedPulling="2026-02-17 17:01:40.178954755 +0000 UTC m=+1167.936030079" observedRunningTime="2026-02-17 17:01:49.32423677 +0000 UTC m=+1177.081312104" watchObservedRunningTime="2026-02-17 17:01:49.330701179 +0000 UTC m=+1177.087776493" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.343765 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-stczv" podStartSLOduration=7.585111598 podStartE2EDuration="14.343691859s" podCreationTimestamp="2026-02-17 17:01:35 +0000 UTC" firstStartedPulling="2026-02-17 17:01:41.33918558 +0000 UTC m=+1169.096260904" lastFinishedPulling="2026-02-17 17:01:48.097765841 +0000 UTC m=+1175.854841165" observedRunningTime="2026-02-17 17:01:49.339362763 +0000 UTC m=+1177.096438097" watchObservedRunningTime="2026-02-17 17:01:49.343691859 +0000 UTC m=+1177.100767183" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.353847 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.012965512 podStartE2EDuration="18.353829759s" podCreationTimestamp="2026-02-17 17:01:31 +0000 UTC" firstStartedPulling="2026-02-17 
17:01:41.284068721 +0000 UTC m=+1169.041144065" lastFinishedPulling="2026-02-17 17:01:48.624932978 +0000 UTC m=+1176.382008312" observedRunningTime="2026-02-17 17:01:49.351648136 +0000 UTC m=+1177.108723460" watchObservedRunningTime="2026-02-17 17:01:49.353829759 +0000 UTC m=+1177.110905083" Feb 17 17:01:49 crc kubenswrapper[4694]: I0217 17:01:49.444467 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.435679178000001 podStartE2EDuration="20.444444873s" podCreationTimestamp="2026-02-17 17:01:29 +0000 UTC" firstStartedPulling="2026-02-17 17:01:41.348799607 +0000 UTC m=+1169.105874931" lastFinishedPulling="2026-02-17 17:01:47.357565302 +0000 UTC m=+1175.114640626" observedRunningTime="2026-02-17 17:01:49.438087617 +0000 UTC m=+1177.195162951" watchObservedRunningTime="2026-02-17 17:01:49.444444873 +0000 UTC m=+1177.201520197" Feb 17 17:01:50 crc kubenswrapper[4694]: I0217 17:01:50.328102 4694 generic.go:334] "Generic (PLEG): container finished" podID="adef318b-03c0-4281-8b77-30b76a8904e6" containerID="5bbaeeb652857d08d1dffc26a1dadba4e60124f4943b3e81f2d1f5a7ac12725b" exitCode=0 Feb 17 17:01:50 crc kubenswrapper[4694]: I0217 17:01:50.328176 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-q88vj" event={"ID":"adef318b-03c0-4281-8b77-30b76a8904e6","Type":"ContainerDied","Data":"5bbaeeb652857d08d1dffc26a1dadba4e60124f4943b3e81f2d1f5a7ac12725b"} Feb 17 17:01:50 crc kubenswrapper[4694]: I0217 17:01:50.332930 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"647b8309-483b-4f58-8360-202bb4b14824","Type":"ContainerStarted","Data":"3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053"} Feb 17 17:01:50 crc kubenswrapper[4694]: I0217 17:01:50.334706 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"30f5bee5-cb28-4508-b091-35e85e299afa","Type":"ContainerStarted","Data":"f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e"} Feb 17 17:01:50 crc kubenswrapper[4694]: I0217 17:01:50.542770 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" Feb 17 17:01:50 crc kubenswrapper[4694]: I0217 17:01:50.599032 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-8plwp"] Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.343491 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-q88vj" event={"ID":"adef318b-03c0-4281-8b77-30b76a8904e6","Type":"ContainerStarted","Data":"0dbc1e20685d806ec913ebea7d13e39a420046096b21f91aed86afd0d3361e40"} Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.343868 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.343889 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-q88vj" event={"ID":"adef318b-03c0-4281-8b77-30b76a8904e6","Type":"ContainerStarted","Data":"6556903b8ed64b6f5c0094c54388001f3a2a0d507dc59b32ee4b600990494bf9"} Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.345155 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"abf651b7-0b06-4b95-916e-e7fe6630d272","Type":"ContainerStarted","Data":"30fe648aac4afced2dfaaa59f3209a7839e1d96cff8f15afc6d616048b156dec"} Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.347431 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"391b1870-6558-40d0-be12-d31b3a57ed32","Type":"ContainerStarted","Data":"0c2a5bb7026d1213a744e17efca93bf926a6dfa613a7d4c42380c97b48d6119f"} Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.348068 4694 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerName="dnsmasq-dns" containerID="cri-o://c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b" gracePeriod=10
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.367489 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-q88vj" podStartSLOduration=10.396483972 podStartE2EDuration="16.367471185s" podCreationTimestamp="2026-02-17 17:01:35 +0000 UTC" firstStartedPulling="2026-02-17 17:01:42.018993811 +0000 UTC m=+1169.776069135" lastFinishedPulling="2026-02-17 17:01:47.989981024 +0000 UTC m=+1175.747056348" observedRunningTime="2026-02-17 17:01:51.361285332 +0000 UTC m=+1179.118360656" watchObservedRunningTime="2026-02-17 17:01:51.367471185 +0000 UTC m=+1179.124546499"
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.386466 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.677492755 podStartE2EDuration="17.386448943s" podCreationTimestamp="2026-02-17 17:01:34 +0000 UTC" firstStartedPulling="2026-02-17 17:01:42.11997215 +0000 UTC m=+1169.877047474" lastFinishedPulling="2026-02-17 17:01:50.828928338 +0000 UTC m=+1178.586003662" observedRunningTime="2026-02-17 17:01:51.384084664 +0000 UTC m=+1179.141159988" watchObservedRunningTime="2026-02-17 17:01:51.386448943 +0000 UTC m=+1179.143524267"
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.409629 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.991172143 podStartE2EDuration="14.409591113s" podCreationTimestamp="2026-02-17 17:01:37 +0000 UTC" firstStartedPulling="2026-02-17 17:01:41.417767387 +0000 UTC m=+1169.174842711" lastFinishedPulling="2026-02-17 17:01:50.836186367 +0000 UTC m=+1178.593261681" observedRunningTime="2026-02-17 17:01:51.401418992 +0000 UTC m=+1179.158494316" watchObservedRunningTime="2026-02-17 17:01:51.409591113 +0000 UTC m=+1179.166666447"
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.713739 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.837712 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-config\") pod \"e554efe9-fa59-446a-a376-a7339a08bf7c\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") "
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.837771 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbz5d\" (UniqueName: \"kubernetes.io/projected/e554efe9-fa59-446a-a376-a7339a08bf7c-kube-api-access-qbz5d\") pod \"e554efe9-fa59-446a-a376-a7339a08bf7c\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") "
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.838006 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-dns-svc\") pod \"e554efe9-fa59-446a-a376-a7339a08bf7c\" (UID: \"e554efe9-fa59-446a-a376-a7339a08bf7c\") "
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.844812 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e554efe9-fa59-446a-a376-a7339a08bf7c-kube-api-access-qbz5d" (OuterVolumeSpecName: "kube-api-access-qbz5d") pod "e554efe9-fa59-446a-a376-a7339a08bf7c" (UID: "e554efe9-fa59-446a-a376-a7339a08bf7c"). InnerVolumeSpecName "kube-api-access-qbz5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.873967 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-config" (OuterVolumeSpecName: "config") pod "e554efe9-fa59-446a-a376-a7339a08bf7c" (UID: "e554efe9-fa59-446a-a376-a7339a08bf7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.875957 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e554efe9-fa59-446a-a376-a7339a08bf7c" (UID: "e554efe9-fa59-446a-a376-a7339a08bf7c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.939663 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbz5d\" (UniqueName: \"kubernetes.io/projected/e554efe9-fa59-446a-a376-a7339a08bf7c-kube-api-access-qbz5d\") on node \"crc\" DevicePath \"\""
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.939703 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 17:01:51 crc kubenswrapper[4694]: I0217 17:01:51.939713 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e554efe9-fa59-446a-a376-a7339a08bf7c-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.357640 4694 generic.go:334] "Generic (PLEG): container finished" podID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerID="c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b" exitCode=0
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.357754 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-8plwp"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.357758 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" event={"ID":"e554efe9-fa59-446a-a376-a7339a08bf7c","Type":"ContainerDied","Data":"c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b"}
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.358238 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-8plwp" event={"ID":"e554efe9-fa59-446a-a376-a7339a08bf7c","Type":"ContainerDied","Data":"6ea506d1ee1c0dadfb9b06a3a28b7c4aac7dc9ec533c81d4b08516ef8244d64c"}
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.358269 4694 scope.go:117] "RemoveContainer" containerID="c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.359187 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-q88vj"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.391434 4694 scope.go:117] "RemoveContainer" containerID="2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.396784 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-8plwp"]
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.405448 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-8plwp"]
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.432784 4694 scope.go:117] "RemoveContainer" containerID="c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b"
Feb 17 17:01:52 crc kubenswrapper[4694]: E0217 17:01:52.433332 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b\": container with ID starting with c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b not found: ID does not exist" containerID="c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.433393 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b"} err="failed to get container status \"c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b\": rpc error: code = NotFound desc = could not find container \"c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b\": container with ID starting with c411066d566c624196d228db3f2ce6e7ceee0a8425b5074d51163dc369c20d6b not found: ID does not exist"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.433433 4694 scope.go:117] "RemoveContainer" containerID="2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76"
Feb 17 17:01:52 crc kubenswrapper[4694]: E0217 17:01:52.433951 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76\": container with ID starting with 2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76 not found: ID does not exist" containerID="2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.434053 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76"} err="failed to get container status \"2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76\": rpc error: code = NotFound desc = could not find container \"2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76\": container with ID starting with 2f00512e2b47caeb276964958e69de5fa37ee9a7fcfb52a9d98e434b1a68de76 not found: ID does not exist"
Feb 17 17:01:52 crc kubenswrapper[4694]: I0217 17:01:52.910881 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" path="/var/lib/kubelet/pods/e554efe9-fa59-446a-a376-a7339a08bf7c/volumes"
Feb 17 17:01:53 crc kubenswrapper[4694]: I0217 17:01:53.374370 4694 generic.go:334] "Generic (PLEG): container finished" podID="a7cab10b-d837-44e6-81c7-8bfdb36a4d3c" containerID="52ba920e44ae807a9721919f1305c3ec54027076bac030bb31c59d9c1277687a" exitCode=0
Feb 17 17:01:53 crc kubenswrapper[4694]: I0217 17:01:53.374427 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c","Type":"ContainerDied","Data":"52ba920e44ae807a9721919f1305c3ec54027076bac030bb31c59d9c1277687a"}
Feb 17 17:01:53 crc kubenswrapper[4694]: I0217 17:01:53.376799 4694 generic.go:334] "Generic (PLEG): container finished" podID="a0c805cf-d310-4594-8584-1061330e4c94" containerID="81541620d267325bb3505a8a61175d5bf5494eb53d9625d7b2d027136080e556" exitCode=0
Feb 17 17:01:53 crc kubenswrapper[4694]: I0217 17:01:53.377744 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0c805cf-d310-4594-8584-1061330e4c94","Type":"ContainerDied","Data":"81541620d267325bb3505a8a61175d5bf5494eb53d9625d7b2d027136080e556"}
Feb 17 17:01:53 crc kubenswrapper[4694]: I0217 17:01:53.443106 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 17 17:01:53 crc kubenswrapper[4694]: I0217 17:01:53.515780 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.009031 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.009081 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.058357 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.391665 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a7cab10b-d837-44e6-81c7-8bfdb36a4d3c","Type":"ContainerStarted","Data":"e6bceb8094ba1aa97b207ea90b34a00319c59fe92b2cd675b38f9f9c3996b63a"}
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.395130 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0c805cf-d310-4594-8584-1061330e4c94","Type":"ContainerStarted","Data":"3e56ab6b2fcd120df2b71c92e6a1d554b2e46378e5815356c6f1dc3f96854cc3"}
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.396020 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.424397 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.812647808 podStartE2EDuration="27.424375253s" podCreationTimestamp="2026-02-17 17:01:27 +0000 UTC" firstStartedPulling="2026-02-17 17:01:40.833528213 +0000 UTC m=+1168.590603537" lastFinishedPulling="2026-02-17 17:01:48.445255638 +0000 UTC m=+1176.202330982" observedRunningTime="2026-02-17 17:01:54.417820531 +0000 UTC m=+1182.174895895" watchObservedRunningTime="2026-02-17 17:01:54.424375253 +0000 UTC m=+1182.181450577"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.441323 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.448598 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.282743538 podStartE2EDuration="28.44857903s" podCreationTimestamp="2026-02-17 17:01:26 +0000 UTC" firstStartedPulling="2026-02-17 17:01:40.82288446 +0000 UTC m=+1168.579959784" lastFinishedPulling="2026-02-17 17:01:47.988719922 +0000 UTC m=+1175.745795276" observedRunningTime="2026-02-17 17:01:54.441273899 +0000 UTC m=+1182.198349243" watchObservedRunningTime="2026-02-17 17:01:54.44857903 +0000 UTC m=+1182.205654354"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.451246 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.529125 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.638058 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-bqv2c"]
Feb 17 17:01:54 crc kubenswrapper[4694]: E0217 17:01:54.638467 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerName="dnsmasq-dns"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.638492 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerName="dnsmasq-dns"
Feb 17 17:01:54 crc kubenswrapper[4694]: E0217 17:01:54.638549 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerName="init"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.638560 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerName="init"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.638756 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e554efe9-fa59-446a-a376-a7339a08bf7c" containerName="dnsmasq-dns"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.639639 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.648287 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-bqv2c"]
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.688939 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.723130 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86zht\" (UniqueName: \"kubernetes.io/projected/d5dfe16d-962d-4100-adac-dd22b0ea4df6-kube-api-access-86zht\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.723405 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-config\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.723519 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.723660 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.733051 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xtmfm"]
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.734333 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.737382 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.745654 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xtmfm"]
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.825434 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e1c89368-7041-4c84-8ba6-624d0f0b695e-ovn-rundir\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.825801 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86zht\" (UniqueName: \"kubernetes.io/projected/d5dfe16d-962d-4100-adac-dd22b0ea4df6-kube-api-access-86zht\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.825889 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-config\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826025 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8fmh\" (UniqueName: \"kubernetes.io/projected/e1c89368-7041-4c84-8ba6-624d0f0b695e-kube-api-access-w8fmh\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826109 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826176 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c89368-7041-4c84-8ba6-624d0f0b695e-combined-ca-bundle\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826339 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1c89368-7041-4c84-8ba6-624d0f0b695e-config\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826436 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e1c89368-7041-4c84-8ba6-624d0f0b695e-ovs-rundir\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826507 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c89368-7041-4c84-8ba6-624d0f0b695e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.826587 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.827858 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.832664 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-config\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.834403 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.852810 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.859925 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.868407 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.868591 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-znx5z"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.868588 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86zht\" (UniqueName: \"kubernetes.io/projected/d5dfe16d-962d-4100-adac-dd22b0ea4df6-kube-api-access-86zht\") pod \"dnsmasq-dns-6bc7876d45-bqv2c\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.868712 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.869309 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.872735 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.928092 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e1c89368-7041-4c84-8ba6-624d0f0b695e-ovn-rundir\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.928216 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8fmh\" (UniqueName: \"kubernetes.io/projected/e1c89368-7041-4c84-8ba6-624d0f0b695e-kube-api-access-w8fmh\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.928405 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e1c89368-7041-4c84-8ba6-624d0f0b695e-ovn-rundir\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.928545 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c89368-7041-4c84-8ba6-624d0f0b695e-combined-ca-bundle\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.928640 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1c89368-7041-4c84-8ba6-624d0f0b695e-config\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.929346 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1c89368-7041-4c84-8ba6-624d0f0b695e-config\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.929417 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e1c89368-7041-4c84-8ba6-624d0f0b695e-ovs-rundir\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.929447 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c89368-7041-4c84-8ba6-624d0f0b695e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.929512 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e1c89368-7041-4c84-8ba6-624d0f0b695e-ovs-rundir\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.936381 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c89368-7041-4c84-8ba6-624d0f0b695e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.952528 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c89368-7041-4c84-8ba6-624d0f0b695e-combined-ca-bundle\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.977326 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8fmh\" (UniqueName: \"kubernetes.io/projected/e1c89368-7041-4c84-8ba6-624d0f0b695e-kube-api-access-w8fmh\") pod \"ovn-controller-metrics-xtmfm\" (UID: \"e1c89368-7041-4c84-8ba6-624d0f0b695e\") " pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.978899 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-bqv2c"]
Feb 17 17:01:54 crc kubenswrapper[4694]: I0217 17:01:54.979557 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.003636 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-bzvwc"]
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.005140 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bzvwc"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.025984 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bzvwc"]
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.048325 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.049656 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.049820 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.049935 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtkr8\" (UniqueName: \"kubernetes.io/projected/10287785-ab79-4801-903b-0b4acdc8aca8-kube-api-access-qtkr8\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.057936 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10287785-ab79-4801-903b-0b4acdc8aca8-config\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.058098 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.058266 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10287785-ab79-4801-903b-0b4acdc8aca8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.058401 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10287785-ab79-4801-903b-0b4acdc8aca8-scripts\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.058869 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xtmfm"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.160534 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-config\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.160882 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10287785-ab79-4801-903b-0b4acdc8aca8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.160911 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10287785-ab79-4801-903b-0b4acdc8aca8-scripts\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.160961 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.160985 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-dns-svc\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161054 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161089 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxbf7\" (UniqueName: \"kubernetes.io/projected/f2b7a020-a821-4443-bfa6-0015ec59195c-kube-api-access-rxbf7\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161117 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161148 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161185 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtkr8\" (UniqueName: \"kubernetes.io/projected/10287785-ab79-4801-903b-0b4acdc8aca8-kube-api-access-qtkr8\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161217 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10287785-ab79-4801-903b-0b4acdc8aca8-config\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.161252 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.165023 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/10287785-ab79-4801-903b-0b4acdc8aca8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.167355 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/10287785-ab79-4801-903b-0b4acdc8aca8-scripts\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.170562 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10287785-ab79-4801-903b-0b4acdc8aca8-config\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.180355 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.181011 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.185443 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10287785-ab79-4801-903b-0b4acdc8aca8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.189576 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtkr8\" (UniqueName: \"kubernetes.io/projected/10287785-ab79-4801-903b-0b4acdc8aca8-kube-api-access-qtkr8\") pod \"ovn-northd-0\" (UID: \"10287785-ab79-4801-903b-0b4acdc8aca8\") " pod="openstack/ovn-northd-0"
Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.243095 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.264160 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.264223 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxbf7\" (UniqueName: \"kubernetes.io/projected/f2b7a020-a821-4443-bfa6-0015ec59195c-kube-api-access-rxbf7\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.264343 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-config\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.264407 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.264435 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-dns-svc\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 
crc kubenswrapper[4694]: I0217 17:01:55.266419 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.267109 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.268081 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-dns-svc\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.270784 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-config\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.293546 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxbf7\" (UniqueName: \"kubernetes.io/projected/f2b7a020-a821-4443-bfa6-0015ec59195c-kube-api-access-rxbf7\") pod \"dnsmasq-dns-8554648995-bzvwc\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.396582 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.567334 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-bqv2c"] Feb 17 17:01:55 crc kubenswrapper[4694]: W0217 17:01:55.569489 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5dfe16d_962d_4100_adac_dd22b0ea4df6.slice/crio-1a3614a68e2d5b7d984ec86fbab4c094f15fe0f41f7c134fcb0e1a4f51325d64 WatchSource:0}: Error finding container 1a3614a68e2d5b7d984ec86fbab4c094f15fe0f41f7c134fcb0e1a4f51325d64: Status 404 returned error can't find the container with id 1a3614a68e2d5b7d984ec86fbab4c094f15fe0f41f7c134fcb0e1a4f51325d64 Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.725165 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xtmfm"] Feb 17 17:01:55 crc kubenswrapper[4694]: W0217 17:01:55.729184 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c89368_7041_4c84_8ba6_624d0f0b695e.slice/crio-503efbdd59c3c1c6e03b899b951209fec3a2b46f2048acf7d021197dea8e0933 WatchSource:0}: Error finding container 503efbdd59c3c1c6e03b899b951209fec3a2b46f2048acf7d021197dea8e0933: Status 404 returned error can't find the container with id 503efbdd59c3c1c6e03b899b951209fec3a2b46f2048acf7d021197dea8e0933 Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.868975 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 17:01:55 crc kubenswrapper[4694]: I0217 17:01:55.885352 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bzvwc"] Feb 17 17:01:55 crc kubenswrapper[4694]: W0217 17:01:55.911215 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b7a020_a821_4443_bfa6_0015ec59195c.slice/crio-ac0f048b1f15d1abb69546d0fe29de3b1ef93bb7030c65b02a65643df6e8090c WatchSource:0}: Error finding container ac0f048b1f15d1abb69546d0fe29de3b1ef93bb7030c65b02a65643df6e8090c: Status 404 returned error can't find the container with id ac0f048b1f15d1abb69546d0fe29de3b1ef93bb7030c65b02a65643df6e8090c Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.416211 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xtmfm" event={"ID":"e1c89368-7041-4c84-8ba6-624d0f0b695e","Type":"ContainerStarted","Data":"6c35e632ff3ebb46aab84512b32d20d8292cf5221b1a3e861f585f49f53dcb60"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.416743 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xtmfm" event={"ID":"e1c89368-7041-4c84-8ba6-624d0f0b695e","Type":"ContainerStarted","Data":"503efbdd59c3c1c6e03b899b951209fec3a2b46f2048acf7d021197dea8e0933"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.419186 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"10287785-ab79-4801-903b-0b4acdc8aca8","Type":"ContainerStarted","Data":"015a265ca76c13b8e62acc15e20dbad372ba1dadb7da95a993edeb945eeed101"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.421696 4694 generic.go:334] "Generic (PLEG): container finished" podID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerID="d731a433bb1fa91dcf9930f517c662cf063ae7124b7376c8c2236bb38d9b5bb6" exitCode=0 Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.421801 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bzvwc" event={"ID":"f2b7a020-a821-4443-bfa6-0015ec59195c","Type":"ContainerDied","Data":"d731a433bb1fa91dcf9930f517c662cf063ae7124b7376c8c2236bb38d9b5bb6"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.421909 4694 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bzvwc" event={"ID":"f2b7a020-a821-4443-bfa6-0015ec59195c","Type":"ContainerStarted","Data":"ac0f048b1f15d1abb69546d0fe29de3b1ef93bb7030c65b02a65643df6e8090c"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.424266 4694 generic.go:334] "Generic (PLEG): container finished" podID="d5dfe16d-962d-4100-adac-dd22b0ea4df6" containerID="b46a93bd832fe0bab4ee03f3476e9f1e201bf88d34cb465e1848f687c28f3242" exitCode=0 Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.424547 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c" event={"ID":"d5dfe16d-962d-4100-adac-dd22b0ea4df6","Type":"ContainerDied","Data":"b46a93bd832fe0bab4ee03f3476e9f1e201bf88d34cb465e1848f687c28f3242"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.424593 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c" event={"ID":"d5dfe16d-962d-4100-adac-dd22b0ea4df6","Type":"ContainerStarted","Data":"1a3614a68e2d5b7d984ec86fbab4c094f15fe0f41f7c134fcb0e1a4f51325d64"} Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.452178 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xtmfm" podStartSLOduration=2.452017974 podStartE2EDuration="2.452017974s" podCreationTimestamp="2026-02-17 17:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:01:56.450787883 +0000 UTC m=+1184.207863247" watchObservedRunningTime="2026-02-17 17:01:56.452017974 +0000 UTC m=+1184.209093308" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.758136 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.842493 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-config\") pod \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.842535 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86zht\" (UniqueName: \"kubernetes.io/projected/d5dfe16d-962d-4100-adac-dd22b0ea4df6-kube-api-access-86zht\") pod \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.842602 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-ovsdbserver-sb\") pod \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.842663 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-dns-svc\") pod \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\" (UID: \"d5dfe16d-962d-4100-adac-dd22b0ea4df6\") " Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.848203 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dfe16d-962d-4100-adac-dd22b0ea4df6-kube-api-access-86zht" (OuterVolumeSpecName: "kube-api-access-86zht") pod "d5dfe16d-962d-4100-adac-dd22b0ea4df6" (UID: "d5dfe16d-962d-4100-adac-dd22b0ea4df6"). InnerVolumeSpecName "kube-api-access-86zht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.860757 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5dfe16d-962d-4100-adac-dd22b0ea4df6" (UID: "d5dfe16d-962d-4100-adac-dd22b0ea4df6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.861149 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-config" (OuterVolumeSpecName: "config") pod "d5dfe16d-962d-4100-adac-dd22b0ea4df6" (UID: "d5dfe16d-962d-4100-adac-dd22b0ea4df6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.861252 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5dfe16d-962d-4100-adac-dd22b0ea4df6" (UID: "d5dfe16d-962d-4100-adac-dd22b0ea4df6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.945345 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.945393 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.945411 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86zht\" (UniqueName: \"kubernetes.io/projected/d5dfe16d-962d-4100-adac-dd22b0ea4df6-kube-api-access-86zht\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:56 crc kubenswrapper[4694]: I0217 17:01:56.945429 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5dfe16d-962d-4100-adac-dd22b0ea4df6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.435861 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c" Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.435873 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-bqv2c" event={"ID":"d5dfe16d-962d-4100-adac-dd22b0ea4df6","Type":"ContainerDied","Data":"1a3614a68e2d5b7d984ec86fbab4c094f15fe0f41f7c134fcb0e1a4f51325d64"} Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.436623 4694 scope.go:117] "RemoveContainer" containerID="b46a93bd832fe0bab4ee03f3476e9f1e201bf88d34cb465e1848f687c28f3242" Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.438064 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"10287785-ab79-4801-903b-0b4acdc8aca8","Type":"ContainerStarted","Data":"36cdd68fc16d4e914e39cfed71f31c84f52c5c3c7447b529e657b4a0a1985f5f"} Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.442219 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bzvwc" event={"ID":"f2b7a020-a821-4443-bfa6-0015ec59195c","Type":"ContainerStarted","Data":"6316fe2ff56bdeadb4288807a0d03b6aca3855c82f31a845a0e7dc0c6150a12a"} Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.442324 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.476297 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-bqv2c"] Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.489021 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-bqv2c"] Feb 17 17:01:57 crc kubenswrapper[4694]: I0217 17:01:57.507440 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-bzvwc" podStartSLOduration=3.5074248150000003 podStartE2EDuration="3.507424815s" podCreationTimestamp="2026-02-17 17:01:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:01:57.498482624 +0000 UTC m=+1185.255557948" watchObservedRunningTime="2026-02-17 17:01:57.507424815 +0000 UTC m=+1185.264500139" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.159858 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.160196 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.239806 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.451267 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"10287785-ab79-4801-903b-0b4acdc8aca8","Type":"ContainerStarted","Data":"391b99b747d231bb088eda13a55048266473a185c8caf2c7cf88fd304ca3a45c"} Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.451542 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.478263 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.223194537 podStartE2EDuration="4.478227149s" podCreationTimestamp="2026-02-17 17:01:54 +0000 UTC" firstStartedPulling="2026-02-17 17:01:55.899474051 +0000 UTC m=+1183.656549375" lastFinishedPulling="2026-02-17 17:01:57.154506653 +0000 UTC m=+1184.911581987" observedRunningTime="2026-02-17 17:01:58.469977966 +0000 UTC m=+1186.227053310" watchObservedRunningTime="2026-02-17 17:01:58.478227149 +0000 UTC m=+1186.235302473" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.522366 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Feb 17 17:01:58 crc kubenswrapper[4694]: I0217 17:01:58.903840 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dfe16d-962d-4100-adac-dd22b0ea4df6" path="/var/lib/kubelet/pods/d5dfe16d-962d-4100-adac-dd22b0ea4df6/volumes" Feb 17 17:01:59 crc kubenswrapper[4694]: I0217 17:01:59.246368 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 17:01:59 crc kubenswrapper[4694]: I0217 17:01:59.246767 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.718576 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f1bb-account-create-update-v4tm7"] Feb 17 17:02:00 crc kubenswrapper[4694]: E0217 17:02:00.718930 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dfe16d-962d-4100-adac-dd22b0ea4df6" containerName="init" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.718943 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dfe16d-962d-4100-adac-dd22b0ea4df6" containerName="init" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.719090 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dfe16d-962d-4100-adac-dd22b0ea4df6" containerName="init" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.719559 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.722304 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.736905 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f1bb-account-create-update-v4tm7"] Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.764536 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-g4wjz"] Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.765900 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.772193 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g4wjz"] Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.805274 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkxx\" (UniqueName: \"kubernetes.io/projected/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-kube-api-access-jjkxx\") pod \"keystone-f1bb-account-create-update-v4tm7\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.805466 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-operator-scripts\") pod \"keystone-f1bb-account-create-update-v4tm7\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.873145 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xtd7t"] Feb 17 17:02:00 crc 
kubenswrapper[4694]: I0217 17:02:00.874347 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.883411 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xtd7t"] Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.907360 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkxx\" (UniqueName: \"kubernetes.io/projected/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-kube-api-access-jjkxx\") pod \"keystone-f1bb-account-create-update-v4tm7\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.907410 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874f9975-fa4a-41b7-9aee-2010fb88447f-operator-scripts\") pod \"keystone-db-create-g4wjz\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.907500 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-operator-scripts\") pod \"keystone-f1bb-account-create-update-v4tm7\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.907528 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4jn\" (UniqueName: \"kubernetes.io/projected/874f9975-fa4a-41b7-9aee-2010fb88447f-kube-api-access-gd4jn\") pod \"keystone-db-create-g4wjz\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " pod="openstack/keystone-db-create-g4wjz" Feb 17 
17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.908592 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-operator-scripts\") pod \"keystone-f1bb-account-create-update-v4tm7\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.930574 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkxx\" (UniqueName: \"kubernetes.io/projected/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-kube-api-access-jjkxx\") pod \"keystone-f1bb-account-create-update-v4tm7\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.936807 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db13-account-create-update-qrdw6"] Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.938144 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.944994 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 17:02:00 crc kubenswrapper[4694]: I0217 17:02:00.948392 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db13-account-create-update-qrdw6"] Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.009709 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfzv\" (UniqueName: \"kubernetes.io/projected/84b858a6-af85-4112-a8bf-1ed44b0004e7-kube-api-access-ppfzv\") pod \"placement-db13-account-create-update-qrdw6\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.010236 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b858a6-af85-4112-a8bf-1ed44b0004e7-operator-scripts\") pod \"placement-db13-account-create-update-qrdw6\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.010285 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874f9975-fa4a-41b7-9aee-2010fb88447f-operator-scripts\") pod \"keystone-db-create-g4wjz\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.010318 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zq97\" (UniqueName: \"kubernetes.io/projected/1d0023fe-7368-42ed-a175-487cd538b39e-kube-api-access-9zq97\") pod 
\"placement-db-create-xtd7t\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.010482 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4jn\" (UniqueName: \"kubernetes.io/projected/874f9975-fa4a-41b7-9aee-2010fb88447f-kube-api-access-gd4jn\") pod \"keystone-db-create-g4wjz\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.010528 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d0023fe-7368-42ed-a175-487cd538b39e-operator-scripts\") pod \"placement-db-create-xtd7t\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.011228 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874f9975-fa4a-41b7-9aee-2010fb88447f-operator-scripts\") pod \"keystone-db-create-g4wjz\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.025214 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4jn\" (UniqueName: \"kubernetes.io/projected/874f9975-fa4a-41b7-9aee-2010fb88447f-kube-api-access-gd4jn\") pod \"keystone-db-create-g4wjz\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.057667 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.086364 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.112484 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b858a6-af85-4112-a8bf-1ed44b0004e7-operator-scripts\") pod \"placement-db13-account-create-update-qrdw6\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.112549 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zq97\" (UniqueName: \"kubernetes.io/projected/1d0023fe-7368-42ed-a175-487cd538b39e-kube-api-access-9zq97\") pod \"placement-db-create-xtd7t\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.112659 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d0023fe-7368-42ed-a175-487cd538b39e-operator-scripts\") pod \"placement-db-create-xtd7t\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.112719 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfzv\" (UniqueName: \"kubernetes.io/projected/84b858a6-af85-4112-a8bf-1ed44b0004e7-kube-api-access-ppfzv\") pod \"placement-db13-account-create-update-qrdw6\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.113310 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b858a6-af85-4112-a8bf-1ed44b0004e7-operator-scripts\") pod 
\"placement-db13-account-create-update-qrdw6\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.113936 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d0023fe-7368-42ed-a175-487cd538b39e-operator-scripts\") pod \"placement-db-create-xtd7t\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.129797 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zq97\" (UniqueName: \"kubernetes.io/projected/1d0023fe-7368-42ed-a175-487cd538b39e-kube-api-access-9zq97\") pod \"placement-db-create-xtd7t\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.130923 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfzv\" (UniqueName: \"kubernetes.io/projected/84b858a6-af85-4112-a8bf-1ed44b0004e7-kube-api-access-ppfzv\") pod \"placement-db13-account-create-update-qrdw6\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.187214 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.276114 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.510241 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f1bb-account-create-update-v4tm7"] Feb 17 17:02:01 crc kubenswrapper[4694]: W0217 17:02:01.511450 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b48ab6_ed97_4844_bcf5_126f60c9b9a3.slice/crio-5b0993ad684162bb954bb2868fa0aadb6aee5da671cac650f38fa21a256050b4 WatchSource:0}: Error finding container 5b0993ad684162bb954bb2868fa0aadb6aee5da671cac650f38fa21a256050b4: Status 404 returned error can't find the container with id 5b0993ad684162bb954bb2868fa0aadb6aee5da671cac650f38fa21a256050b4 Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.529570 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.581344 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g4wjz"] Feb 17 17:02:01 crc kubenswrapper[4694]: W0217 17:02:01.582555 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod874f9975_fa4a_41b7_9aee_2010fb88447f.slice/crio-491ed03648aa5ae83b34764b6e29fbb89513d6a39c1b66711fcde34f5358aa33 WatchSource:0}: Error finding container 491ed03648aa5ae83b34764b6e29fbb89513d6a39c1b66711fcde34f5358aa33: Status 404 returned error can't find the container with id 491ed03648aa5ae83b34764b6e29fbb89513d6a39c1b66711fcde34f5358aa33 Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.603763 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.670187 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-create-xtd7t"] Feb 17 17:02:01 crc kubenswrapper[4694]: W0217 17:02:01.681012 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d0023fe_7368_42ed_a175_487cd538b39e.slice/crio-842929c1f2e085f0a6e2b8afcbdfcbdd1a903c477b90b45b7b5cba2e904e357b WatchSource:0}: Error finding container 842929c1f2e085f0a6e2b8afcbdfcbdd1a903c477b90b45b7b5cba2e904e357b: Status 404 returned error can't find the container with id 842929c1f2e085f0a6e2b8afcbdfcbdd1a903c477b90b45b7b5cba2e904e357b Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.785259 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db13-account-create-update-qrdw6"] Feb 17 17:02:01 crc kubenswrapper[4694]: W0217 17:02:01.790047 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84b858a6_af85_4112_a8bf_1ed44b0004e7.slice/crio-65fe8c0ffcb72504365153a1e070ce5d83426fd7316a5747f2462f23285875bb WatchSource:0}: Error finding container 65fe8c0ffcb72504365153a1e070ce5d83426fd7316a5747f2462f23285875bb: Status 404 returned error can't find the container with id 65fe8c0ffcb72504365153a1e070ce5d83426fd7316a5747f2462f23285875bb Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.885896 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.991149 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bzvwc"] Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.991368 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-bzvwc" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerName="dnsmasq-dns" containerID="cri-o://6316fe2ff56bdeadb4288807a0d03b6aca3855c82f31a845a0e7dc0c6150a12a" 
gracePeriod=10 Feb 17 17:02:01 crc kubenswrapper[4694]: I0217 17:02:01.997754 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.102763 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8lqnf"] Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.104420 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.127381 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8lqnf"] Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.238105 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.238156 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmdtj\" (UniqueName: \"kubernetes.io/projected/2cd132a1-e4c4-4588-a436-daa27b4a1a98-kube-api-access-hmdtj\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.238214 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.238464 
4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-config\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.238528 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.340275 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmdtj\" (UniqueName: \"kubernetes.io/projected/2cd132a1-e4c4-4588-a436-daa27b4a1a98-kube-api-access-hmdtj\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.340423 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.341777 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.341925 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-config\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.341994 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.342807 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.343345 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.343525 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-config\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.343654 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.358080 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmdtj\" (UniqueName: \"kubernetes.io/projected/2cd132a1-e4c4-4588-a436-daa27b4a1a98-kube-api-access-hmdtj\") pod \"dnsmasq-dns-b8fbc5445-8lqnf\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.441970 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.492294 4694 generic.go:334] "Generic (PLEG): container finished" podID="b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" containerID="e3742b0fad3479cfb4be13b484d19a01a24540b2f5f3c93e8b6eb8e6433fec80" exitCode=0 Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.492369 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1bb-account-create-update-v4tm7" event={"ID":"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3","Type":"ContainerDied","Data":"e3742b0fad3479cfb4be13b484d19a01a24540b2f5f3c93e8b6eb8e6433fec80"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.492395 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1bb-account-create-update-v4tm7" event={"ID":"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3","Type":"ContainerStarted","Data":"5b0993ad684162bb954bb2868fa0aadb6aee5da671cac650f38fa21a256050b4"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.494455 4694 generic.go:334] "Generic (PLEG): container finished" podID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerID="6316fe2ff56bdeadb4288807a0d03b6aca3855c82f31a845a0e7dc0c6150a12a" exitCode=0 Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 
17:02:02.494560 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bzvwc" event={"ID":"f2b7a020-a821-4443-bfa6-0015ec59195c","Type":"ContainerDied","Data":"6316fe2ff56bdeadb4288807a0d03b6aca3855c82f31a845a0e7dc0c6150a12a"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.498321 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db13-account-create-update-qrdw6" event={"ID":"84b858a6-af85-4112-a8bf-1ed44b0004e7","Type":"ContainerStarted","Data":"3423ecb763c09df03daad0a422cd33e28707ae6d198eba5a1c2b046e8df06eeb"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.498390 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db13-account-create-update-qrdw6" event={"ID":"84b858a6-af85-4112-a8bf-1ed44b0004e7","Type":"ContainerStarted","Data":"65fe8c0ffcb72504365153a1e070ce5d83426fd7316a5747f2462f23285875bb"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.501250 4694 generic.go:334] "Generic (PLEG): container finished" podID="874f9975-fa4a-41b7-9aee-2010fb88447f" containerID="3ab4c2cee84d3c79918f010d3921fed97bbece35e9d8f9503eac949ac85df9d9" exitCode=0 Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.501381 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g4wjz" event={"ID":"874f9975-fa4a-41b7-9aee-2010fb88447f","Type":"ContainerDied","Data":"3ab4c2cee84d3c79918f010d3921fed97bbece35e9d8f9503eac949ac85df9d9"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.501411 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g4wjz" event={"ID":"874f9975-fa4a-41b7-9aee-2010fb88447f","Type":"ContainerStarted","Data":"491ed03648aa5ae83b34764b6e29fbb89513d6a39c1b66711fcde34f5358aa33"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.503735 4694 generic.go:334] "Generic (PLEG): container finished" podID="1d0023fe-7368-42ed-a175-487cd538b39e" 
containerID="750d66418942f40339f05c0fa3b216d6baaec3352af5b871e96d1a2458362721" exitCode=0 Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.503823 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xtd7t" event={"ID":"1d0023fe-7368-42ed-a175-487cd538b39e","Type":"ContainerDied","Data":"750d66418942f40339f05c0fa3b216d6baaec3352af5b871e96d1a2458362721"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.503894 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xtd7t" event={"ID":"1d0023fe-7368-42ed-a175-487cd538b39e","Type":"ContainerStarted","Data":"842929c1f2e085f0a6e2b8afcbdfcbdd1a903c477b90b45b7b5cba2e904e357b"} Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.559459 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db13-account-create-update-qrdw6" podStartSLOduration=2.559438551 podStartE2EDuration="2.559438551s" podCreationTimestamp="2026-02-17 17:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:02.554995931 +0000 UTC m=+1190.312071245" watchObservedRunningTime="2026-02-17 17:02:02.559438551 +0000 UTC m=+1190.316513875" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.652237 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.747264 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-dns-svc\") pod \"f2b7a020-a821-4443-bfa6-0015ec59195c\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.747337 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-config\") pod \"f2b7a020-a821-4443-bfa6-0015ec59195c\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.747418 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxbf7\" (UniqueName: \"kubernetes.io/projected/f2b7a020-a821-4443-bfa6-0015ec59195c-kube-api-access-rxbf7\") pod \"f2b7a020-a821-4443-bfa6-0015ec59195c\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.747498 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-nb\") pod \"f2b7a020-a821-4443-bfa6-0015ec59195c\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.747578 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-sb\") pod \"f2b7a020-a821-4443-bfa6-0015ec59195c\" (UID: \"f2b7a020-a821-4443-bfa6-0015ec59195c\") " Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.758823 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f2b7a020-a821-4443-bfa6-0015ec59195c-kube-api-access-rxbf7" (OuterVolumeSpecName: "kube-api-access-rxbf7") pod "f2b7a020-a821-4443-bfa6-0015ec59195c" (UID: "f2b7a020-a821-4443-bfa6-0015ec59195c"). InnerVolumeSpecName "kube-api-access-rxbf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.785626 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2b7a020-a821-4443-bfa6-0015ec59195c" (UID: "f2b7a020-a821-4443-bfa6-0015ec59195c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.786850 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2b7a020-a821-4443-bfa6-0015ec59195c" (UID: "f2b7a020-a821-4443-bfa6-0015ec59195c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.788703 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2b7a020-a821-4443-bfa6-0015ec59195c" (UID: "f2b7a020-a821-4443-bfa6-0015ec59195c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.801299 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-config" (OuterVolumeSpecName: "config") pod "f2b7a020-a821-4443-bfa6-0015ec59195c" (UID: "f2b7a020-a821-4443-bfa6-0015ec59195c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.850261 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxbf7\" (UniqueName: \"kubernetes.io/projected/f2b7a020-a821-4443-bfa6-0015ec59195c-kube-api-access-rxbf7\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.850643 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.850791 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.850914 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.851048 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2b7a020-a821-4443-bfa6-0015ec59195c-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:02 crc kubenswrapper[4694]: I0217 17:02:02.945757 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8lqnf"] Feb 17 17:02:02 crc kubenswrapper[4694]: W0217 17:02:02.951746 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cd132a1_e4c4_4588_a436_daa27b4a1a98.slice/crio-4ac4702334ddd7f77bc6f5340f0a8f296d465bdee469dca069110bc2a8cee71b WatchSource:0}: Error finding container 4ac4702334ddd7f77bc6f5340f0a8f296d465bdee469dca069110bc2a8cee71b: Status 404 returned error can't find 
the container with id 4ac4702334ddd7f77bc6f5340f0a8f296d465bdee469dca069110bc2a8cee71b Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.213444 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.213782 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerName="dnsmasq-dns" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.213799 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerName="dnsmasq-dns" Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.213814 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerName="init" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.213822 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerName="init" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.213978 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" containerName="dnsmasq-dns" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.219177 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.220858 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.221165 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fwl42" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.221313 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.222524 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.242318 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.260843 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.261107 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852493c7-97b4-4850-9ef3-44ec598d9d1a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.261183 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbqgq\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-kube-api-access-rbqgq\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " 
pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.261284 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/852493c7-97b4-4850-9ef3-44ec598d9d1a-cache\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.261359 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.261427 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/852493c7-97b4-4850-9ef3-44ec598d9d1a-lock\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.383829 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/852493c7-97b4-4850-9ef3-44ec598d9d1a-cache\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.383882 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.383906 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/852493c7-97b4-4850-9ef3-44ec598d9d1a-lock\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.384040 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.384069 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852493c7-97b4-4850-9ef3-44ec598d9d1a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.384120 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbqgq\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-kube-api-access-rbqgq\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.384156 4694 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.384203 4694 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.384289 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift podName:852493c7-97b4-4850-9ef3-44ec598d9d1a nodeName:}" failed. 
No retries permitted until 2026-02-17 17:02:03.884256976 +0000 UTC m=+1191.641332340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift") pod "swift-storage-0" (UID: "852493c7-97b4-4850-9ef3-44ec598d9d1a") : configmap "swift-ring-files" not found Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.384424 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/852493c7-97b4-4850-9ef3-44ec598d9d1a-cache\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.384497 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/852493c7-97b4-4850-9ef3-44ec598d9d1a-lock\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.384957 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.389295 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852493c7-97b4-4850-9ef3-44ec598d9d1a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.406898 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") 
pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.409275 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbqgq\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-kube-api-access-rbqgq\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.514977 4694 generic.go:334] "Generic (PLEG): container finished" podID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerID="54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc" exitCode=0 Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.515043 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" event={"ID":"2cd132a1-e4c4-4588-a436-daa27b4a1a98","Type":"ContainerDied","Data":"54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc"} Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.515070 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" event={"ID":"2cd132a1-e4c4-4588-a436-daa27b4a1a98","Type":"ContainerStarted","Data":"4ac4702334ddd7f77bc6f5340f0a8f296d465bdee469dca069110bc2a8cee71b"} Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.518457 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bzvwc" event={"ID":"f2b7a020-a821-4443-bfa6-0015ec59195c","Type":"ContainerDied","Data":"ac0f048b1f15d1abb69546d0fe29de3b1ef93bb7030c65b02a65643df6e8090c"} Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.518520 4694 scope.go:117] "RemoveContainer" containerID="6316fe2ff56bdeadb4288807a0d03b6aca3855c82f31a845a0e7dc0c6150a12a" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.518771 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bzvwc" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.523637 4694 generic.go:334] "Generic (PLEG): container finished" podID="84b858a6-af85-4112-a8bf-1ed44b0004e7" containerID="3423ecb763c09df03daad0a422cd33e28707ae6d198eba5a1c2b046e8df06eeb" exitCode=0 Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.523680 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db13-account-create-update-qrdw6" event={"ID":"84b858a6-af85-4112-a8bf-1ed44b0004e7","Type":"ContainerDied","Data":"3423ecb763c09df03daad0a422cd33e28707ae6d198eba5a1c2b046e8df06eeb"} Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.596208 4694 scope.go:117] "RemoveContainer" containerID="d731a433bb1fa91dcf9930f517c662cf063ae7124b7376c8c2236bb38d9b5bb6" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.603801 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bzvwc"] Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.611980 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bzvwc"] Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.901910 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:03 crc kubenswrapper[4694]: I0217 17:02:03.923230 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.923487 4694 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.923501 4694 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 17:02:03 crc kubenswrapper[4694]: E0217 17:02:03.923548 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift podName:852493c7-97b4-4850-9ef3-44ec598d9d1a nodeName:}" failed. No retries permitted until 2026-02-17 17:02:04.923530682 +0000 UTC m=+1192.680606006 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift") pod "swift-storage-0" (UID: "852493c7-97b4-4850-9ef3-44ec598d9d1a") : configmap "swift-ring-files" not found Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.024008 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d0023fe-7368-42ed-a175-487cd538b39e-operator-scripts\") pod \"1d0023fe-7368-42ed-a175-487cd538b39e\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.024484 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zq97\" (UniqueName: \"kubernetes.io/projected/1d0023fe-7368-42ed-a175-487cd538b39e-kube-api-access-9zq97\") pod \"1d0023fe-7368-42ed-a175-487cd538b39e\" (UID: \"1d0023fe-7368-42ed-a175-487cd538b39e\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.025202 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0023fe-7368-42ed-a175-487cd538b39e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d0023fe-7368-42ed-a175-487cd538b39e" (UID: "1d0023fe-7368-42ed-a175-487cd538b39e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.029763 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0023fe-7368-42ed-a175-487cd538b39e-kube-api-access-9zq97" (OuterVolumeSpecName: "kube-api-access-9zq97") pod "1d0023fe-7368-42ed-a175-487cd538b39e" (UID: "1d0023fe-7368-42ed-a175-487cd538b39e"). InnerVolumeSpecName "kube-api-access-9zq97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.086895 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.095525 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.126647 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d0023fe-7368-42ed-a175-487cd538b39e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.126681 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zq97\" (UniqueName: \"kubernetes.io/projected/1d0023fe-7368-42ed-a175-487cd538b39e-kube-api-access-9zq97\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.228062 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-operator-scripts\") pod \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.228169 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjkxx\" (UniqueName: \"kubernetes.io/projected/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-kube-api-access-jjkxx\") pod \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\" (UID: \"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.228579 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" (UID: "b7b48ab6-ed97-4844-bcf5-126f60c9b9a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.228651 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874f9975-fa4a-41b7-9aee-2010fb88447f-operator-scripts\") pod \"874f9975-fa4a-41b7-9aee-2010fb88447f\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.228765 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4jn\" (UniqueName: \"kubernetes.io/projected/874f9975-fa4a-41b7-9aee-2010fb88447f-kube-api-access-gd4jn\") pod \"874f9975-fa4a-41b7-9aee-2010fb88447f\" (UID: \"874f9975-fa4a-41b7-9aee-2010fb88447f\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.229116 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874f9975-fa4a-41b7-9aee-2010fb88447f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "874f9975-fa4a-41b7-9aee-2010fb88447f" (UID: "874f9975-fa4a-41b7-9aee-2010fb88447f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.229304 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874f9975-fa4a-41b7-9aee-2010fb88447f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.229326 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.230766 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-kube-api-access-jjkxx" (OuterVolumeSpecName: "kube-api-access-jjkxx") pod "b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" (UID: "b7b48ab6-ed97-4844-bcf5-126f60c9b9a3"). InnerVolumeSpecName "kube-api-access-jjkxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.232582 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874f9975-fa4a-41b7-9aee-2010fb88447f-kube-api-access-gd4jn" (OuterVolumeSpecName: "kube-api-access-gd4jn") pod "874f9975-fa4a-41b7-9aee-2010fb88447f" (UID: "874f9975-fa4a-41b7-9aee-2010fb88447f"). InnerVolumeSpecName "kube-api-access-gd4jn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.331376 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4jn\" (UniqueName: \"kubernetes.io/projected/874f9975-fa4a-41b7-9aee-2010fb88447f-kube-api-access-gd4jn\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.331413 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjkxx\" (UniqueName: \"kubernetes.io/projected/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3-kube-api-access-jjkxx\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.531543 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f1bb-account-create-update-v4tm7" event={"ID":"b7b48ab6-ed97-4844-bcf5-126f60c9b9a3","Type":"ContainerDied","Data":"5b0993ad684162bb954bb2868fa0aadb6aee5da671cac650f38fa21a256050b4"} Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.531595 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0993ad684162bb954bb2868fa0aadb6aee5da671cac650f38fa21a256050b4" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.531599 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f1bb-account-create-update-v4tm7" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.533357 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" event={"ID":"2cd132a1-e4c4-4588-a436-daa27b4a1a98","Type":"ContainerStarted","Data":"9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790"} Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.533551 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.541715 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g4wjz" event={"ID":"874f9975-fa4a-41b7-9aee-2010fb88447f","Type":"ContainerDied","Data":"491ed03648aa5ae83b34764b6e29fbb89513d6a39c1b66711fcde34f5358aa33"} Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.541769 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491ed03648aa5ae83b34764b6e29fbb89513d6a39c1b66711fcde34f5358aa33" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.541797 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g4wjz" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.546277 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xtd7t" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.546469 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xtd7t" event={"ID":"1d0023fe-7368-42ed-a175-487cd538b39e","Type":"ContainerDied","Data":"842929c1f2e085f0a6e2b8afcbdfcbdd1a903c477b90b45b7b5cba2e904e357b"} Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.546695 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842929c1f2e085f0a6e2b8afcbdfcbdd1a903c477b90b45b7b5cba2e904e357b" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.793329 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" podStartSLOduration=2.793311467 podStartE2EDuration="2.793311467s" podCreationTimestamp="2026-02-17 17:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:04.55493841 +0000 UTC m=+1192.312013734" watchObservedRunningTime="2026-02-17 17:02:04.793311467 +0000 UTC m=+1192.550386791" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.794060 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7n5vf"] Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.794422 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874f9975-fa4a-41b7-9aee-2010fb88447f" containerName="mariadb-database-create" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.795913 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="874f9975-fa4a-41b7-9aee-2010fb88447f" containerName="mariadb-database-create" Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.796048 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" containerName="mariadb-account-create-update" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 
17:02:04.796057 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" containerName="mariadb-account-create-update" Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.796163 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0023fe-7368-42ed-a175-487cd538b39e" containerName="mariadb-database-create" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.796171 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0023fe-7368-42ed-a175-487cd538b39e" containerName="mariadb-database-create" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.797162 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0023fe-7368-42ed-a175-487cd538b39e" containerName="mariadb-database-create" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.797181 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="874f9975-fa4a-41b7-9aee-2010fb88447f" containerName="mariadb-database-create" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.797189 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" containerName="mariadb-account-create-update" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.797907 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.812726 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7n5vf"] Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.857714 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.861034 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6087-account-create-update-v6hqf"] Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.862757 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b858a6-af85-4112-a8bf-1ed44b0004e7" containerName="mariadb-account-create-update" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.862782 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b858a6-af85-4112-a8bf-1ed44b0004e7" containerName="mariadb-account-create-update" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.863013 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b858a6-af85-4112-a8bf-1ed44b0004e7" containerName="mariadb-account-create-update" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.863735 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.865687 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.872993 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6087-account-create-update-v6hqf"] Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.907058 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b7a020-a821-4443-bfa6-0015ec59195c" path="/var/lib/kubelet/pods/f2b7a020-a821-4443-bfa6-0015ec59195c/volumes" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.943356 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b858a6-af85-4112-a8bf-1ed44b0004e7-operator-scripts\") pod \"84b858a6-af85-4112-a8bf-1ed44b0004e7\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.943579 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppfzv\" (UniqueName: \"kubernetes.io/projected/84b858a6-af85-4112-a8bf-1ed44b0004e7-kube-api-access-ppfzv\") pod \"84b858a6-af85-4112-a8bf-1ed44b0004e7\" (UID: \"84b858a6-af85-4112-a8bf-1ed44b0004e7\") " Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.943892 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b858a6-af85-4112-a8bf-1ed44b0004e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84b858a6-af85-4112-a8bf-1ed44b0004e7" (UID: "84b858a6-af85-4112-a8bf-1ed44b0004e7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.943943 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjqs\" (UniqueName: \"kubernetes.io/projected/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-kube-api-access-xmjqs\") pod \"glance-db-create-7n5vf\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") " pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.943986 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.944026 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptm2w\" (UniqueName: \"kubernetes.io/projected/79a2e4cd-27d5-4b34-8b28-61db84dafc59-kube-api-access-ptm2w\") pod \"glance-6087-account-create-update-v6hqf\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") " pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.944063 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-operator-scripts\") pod \"glance-db-create-7n5vf\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") " pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.944123 4694 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.944139 4694 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 17:02:04 crc kubenswrapper[4694]: E0217 17:02:04.944185 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift podName:852493c7-97b4-4850-9ef3-44ec598d9d1a nodeName:}" failed. No retries permitted until 2026-02-17 17:02:06.944168516 +0000 UTC m=+1194.701243840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift") pod "swift-storage-0" (UID: "852493c7-97b4-4850-9ef3-44ec598d9d1a") : configmap "swift-ring-files" not found Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.944255 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79a2e4cd-27d5-4b34-8b28-61db84dafc59-operator-scripts\") pod \"glance-6087-account-create-update-v6hqf\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") " pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.944436 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b858a6-af85-4112-a8bf-1ed44b0004e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:04 crc kubenswrapper[4694]: I0217 17:02:04.946731 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b858a6-af85-4112-a8bf-1ed44b0004e7-kube-api-access-ppfzv" (OuterVolumeSpecName: "kube-api-access-ppfzv") pod "84b858a6-af85-4112-a8bf-1ed44b0004e7" (UID: "84b858a6-af85-4112-a8bf-1ed44b0004e7"). InnerVolumeSpecName "kube-api-access-ppfzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.046148 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79a2e4cd-27d5-4b34-8b28-61db84dafc59-operator-scripts\") pod \"glance-6087-account-create-update-v6hqf\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") " pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.046283 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjqs\" (UniqueName: \"kubernetes.io/projected/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-kube-api-access-xmjqs\") pod \"glance-db-create-7n5vf\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") " pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.046323 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptm2w\" (UniqueName: \"kubernetes.io/projected/79a2e4cd-27d5-4b34-8b28-61db84dafc59-kube-api-access-ptm2w\") pod \"glance-6087-account-create-update-v6hqf\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") " pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.046367 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-operator-scripts\") pod \"glance-db-create-7n5vf\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") " pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.046407 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppfzv\" (UniqueName: \"kubernetes.io/projected/84b858a6-af85-4112-a8bf-1ed44b0004e7-kube-api-access-ppfzv\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 
17:02:05.047040 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-operator-scripts\") pod \"glance-db-create-7n5vf\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") " pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.047447 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79a2e4cd-27d5-4b34-8b28-61db84dafc59-operator-scripts\") pod \"glance-6087-account-create-update-v6hqf\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") " pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.063302 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjqs\" (UniqueName: \"kubernetes.io/projected/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-kube-api-access-xmjqs\") pod \"glance-db-create-7n5vf\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") " pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.063302 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptm2w\" (UniqueName: \"kubernetes.io/projected/79a2e4cd-27d5-4b34-8b28-61db84dafc59-kube-api-access-ptm2w\") pod \"glance-6087-account-create-update-v6hqf\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") " pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.115399 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7n5vf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.179879 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6087-account-create-update-v6hqf" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.540142 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7n5vf"] Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.554825 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db13-account-create-update-qrdw6" event={"ID":"84b858a6-af85-4112-a8bf-1ed44b0004e7","Type":"ContainerDied","Data":"65fe8c0ffcb72504365153a1e070ce5d83426fd7316a5747f2462f23285875bb"} Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.554860 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db13-account-create-update-qrdw6" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.554879 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65fe8c0ffcb72504365153a1e070ce5d83426fd7316a5747f2462f23285875bb" Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.555956 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7n5vf" event={"ID":"e1f94c8d-bdff-499c-8e4f-1c5a022f328f","Type":"ContainerStarted","Data":"fb53144ebb0a2bb1e4c80b0a1b56c0216da60c3f6ee16ca27f337dfba3228a37"} Feb 17 17:02:05 crc kubenswrapper[4694]: I0217 17:02:05.659157 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6087-account-create-update-v6hqf"] Feb 17 17:02:05 crc kubenswrapper[4694]: W0217 17:02:05.662436 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79a2e4cd_27d5_4b34_8b28_61db84dafc59.slice/crio-d8694e16a5ee2ada4fc7efebe8c139eb60d4aaa2d5ae0e1dd93c2afe81809b82 WatchSource:0}: Error finding container d8694e16a5ee2ada4fc7efebe8c139eb60d4aaa2d5ae0e1dd93c2afe81809b82: Status 404 returned error can't find the container with id 
d8694e16a5ee2ada4fc7efebe8c139eb60d4aaa2d5ae0e1dd93c2afe81809b82
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.456359 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wl8mg"]
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.459369 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.461364 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.469513 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wl8mg"]
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.565664 4694 generic.go:334] "Generic (PLEG): container finished" podID="79a2e4cd-27d5-4b34-8b28-61db84dafc59" containerID="ceb990be6949e681bb1fea1b62505171ada5b24ced52ef07719bd8ff02456cbc" exitCode=0
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.565735 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6087-account-create-update-v6hqf" event={"ID":"79a2e4cd-27d5-4b34-8b28-61db84dafc59","Type":"ContainerDied","Data":"ceb990be6949e681bb1fea1b62505171ada5b24ced52ef07719bd8ff02456cbc"}
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.565762 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6087-account-create-update-v6hqf" event={"ID":"79a2e4cd-27d5-4b34-8b28-61db84dafc59","Type":"ContainerStarted","Data":"d8694e16a5ee2ada4fc7efebe8c139eb60d4aaa2d5ae0e1dd93c2afe81809b82"}
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.567030 4694 generic.go:334] "Generic (PLEG): container finished" podID="e1f94c8d-bdff-499c-8e4f-1c5a022f328f" containerID="98162195a19395ea4a0720118004d57cbaa07b39b01083f919adf7834060a6f8" exitCode=0
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.567055 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7n5vf" event={"ID":"e1f94c8d-bdff-499c-8e4f-1c5a022f328f","Type":"ContainerDied","Data":"98162195a19395ea4a0720118004d57cbaa07b39b01083f919adf7834060a6f8"}
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.571061 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-operator-scripts\") pod \"root-account-create-update-wl8mg\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") " pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.571106 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7p6\" (UniqueName: \"kubernetes.io/projected/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-kube-api-access-mn7p6\") pod \"root-account-create-update-wl8mg\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") " pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.672425 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-operator-scripts\") pod \"root-account-create-update-wl8mg\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") " pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.672509 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7p6\" (UniqueName: \"kubernetes.io/projected/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-kube-api-access-mn7p6\") pod \"root-account-create-update-wl8mg\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") " pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.674713 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-operator-scripts\") pod \"root-account-create-update-wl8mg\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") " pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.708294 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7p6\" (UniqueName: \"kubernetes.io/projected/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-kube-api-access-mn7p6\") pod \"root-account-create-update-wl8mg\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") " pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.789281 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:06 crc kubenswrapper[4694]: I0217 17:02:06.977668 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0"
Feb 17 17:02:06 crc kubenswrapper[4694]: E0217 17:02:06.978005 4694 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 17:02:06 crc kubenswrapper[4694]: E0217 17:02:06.978042 4694 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 17:02:06 crc kubenswrapper[4694]: E0217 17:02:06.978110 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift podName:852493c7-97b4-4850-9ef3-44ec598d9d1a nodeName:}" failed. No retries permitted until 2026-02-17 17:02:10.978086042 +0000 UTC m=+1198.735161376 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift") pod "swift-storage-0" (UID: "852493c7-97b4-4850-9ef3-44ec598d9d1a") : configmap "swift-ring-files" not found
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.081223 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-q6jc9"]
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.082500 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.086945 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.087645 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.087817 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.094722 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-q6jc9"]
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181121 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-dispersionconf\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181310 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-ring-data-devices\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181427 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-swiftconf\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181490 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-scripts\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181518 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srkj\" (UniqueName: \"kubernetes.io/projected/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-kube-api-access-2srkj\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181575 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-etc-swift\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.181695 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-combined-ca-bundle\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.239848 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wl8mg"]
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283709 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-dispersionconf\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283788 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-ring-data-devices\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283829 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-swiftconf\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283850 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-scripts\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283872 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srkj\" (UniqueName: \"kubernetes.io/projected/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-kube-api-access-2srkj\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283908 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-etc-swift\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.283949 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-combined-ca-bundle\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.284679 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-etc-swift\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.284929 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-scripts\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.285104 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-ring-data-devices\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.291516 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-combined-ca-bundle\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.295310 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-swiftconf\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.300869 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-dispersionconf\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.306055 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srkj\" (UniqueName: \"kubernetes.io/projected/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-kube-api-access-2srkj\") pod \"swift-ring-rebalance-q6jc9\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.406912 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q6jc9"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.579931 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wl8mg" event={"ID":"02c3f4af-5b32-4e7f-a562-6fd529a1abaf","Type":"ContainerStarted","Data":"08807e28790f483219c90218f1222e272851656dca7c0fc037261c614e606669"}
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.580164 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wl8mg" event={"ID":"02c3f4af-5b32-4e7f-a562-6fd529a1abaf","Type":"ContainerStarted","Data":"0bdc25861f9ec39c887b402e125b00357a679ed32beb2ff2fcab802b98d0e1a7"}
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.598043 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-wl8mg" podStartSLOduration=1.5980206460000002 podStartE2EDuration="1.598020646s" podCreationTimestamp="2026-02-17 17:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:07.593358951 +0000 UTC m=+1195.350434275" watchObservedRunningTime="2026-02-17 17:02:07.598020646 +0000 UTC m=+1195.355095970"
Feb 17 17:02:07 crc kubenswrapper[4694]: W0217 17:02:07.861969 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78214f4f_59b9_4ed5_bd7c_8fb4560c6ef3.slice/crio-3984162caae755763d8c2981117cf196104532d3d52ae0ff79ed20f06149c448 WatchSource:0}: Error finding container 3984162caae755763d8c2981117cf196104532d3d52ae0ff79ed20f06149c448: Status 404 returned error can't find the container with id 3984162caae755763d8c2981117cf196104532d3d52ae0ff79ed20f06149c448
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.863940 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-q6jc9"]
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.912592 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7n5vf"
Feb 17 17:02:07 crc kubenswrapper[4694]: I0217 17:02:07.961944 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6087-account-create-update-v6hqf"
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.003164 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmjqs\" (UniqueName: \"kubernetes.io/projected/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-kube-api-access-xmjqs\") pod \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") "
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.003217 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-operator-scripts\") pod \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\" (UID: \"e1f94c8d-bdff-499c-8e4f-1c5a022f328f\") "
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.010027 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-kube-api-access-xmjqs" (OuterVolumeSpecName: "kube-api-access-xmjqs") pod "e1f94c8d-bdff-499c-8e4f-1c5a022f328f" (UID: "e1f94c8d-bdff-499c-8e4f-1c5a022f328f"). InnerVolumeSpecName "kube-api-access-xmjqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.010391 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1f94c8d-bdff-499c-8e4f-1c5a022f328f" (UID: "e1f94c8d-bdff-499c-8e4f-1c5a022f328f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.104734 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79a2e4cd-27d5-4b34-8b28-61db84dafc59-operator-scripts\") pod \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") "
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.104867 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptm2w\" (UniqueName: \"kubernetes.io/projected/79a2e4cd-27d5-4b34-8b28-61db84dafc59-kube-api-access-ptm2w\") pod \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\" (UID: \"79a2e4cd-27d5-4b34-8b28-61db84dafc59\") "
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.105360 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmjqs\" (UniqueName: \"kubernetes.io/projected/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-kube-api-access-xmjqs\") on node \"crc\" DevicePath \"\""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.105382 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1f94c8d-bdff-499c-8e4f-1c5a022f328f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.106223 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a2e4cd-27d5-4b34-8b28-61db84dafc59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79a2e4cd-27d5-4b34-8b28-61db84dafc59" (UID: "79a2e4cd-27d5-4b34-8b28-61db84dafc59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.109455 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a2e4cd-27d5-4b34-8b28-61db84dafc59-kube-api-access-ptm2w" (OuterVolumeSpecName: "kube-api-access-ptm2w") pod "79a2e4cd-27d5-4b34-8b28-61db84dafc59" (UID: "79a2e4cd-27d5-4b34-8b28-61db84dafc59"). InnerVolumeSpecName "kube-api-access-ptm2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.206684 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptm2w\" (UniqueName: \"kubernetes.io/projected/79a2e4cd-27d5-4b34-8b28-61db84dafc59-kube-api-access-ptm2w\") on node \"crc\" DevicePath \"\""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.207001 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79a2e4cd-27d5-4b34-8b28-61db84dafc59-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.590429 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q6jc9" event={"ID":"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3","Type":"ContainerStarted","Data":"3984162caae755763d8c2981117cf196104532d3d52ae0ff79ed20f06149c448"}
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.592065 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6087-account-create-update-v6hqf"
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.592081 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6087-account-create-update-v6hqf" event={"ID":"79a2e4cd-27d5-4b34-8b28-61db84dafc59","Type":"ContainerDied","Data":"d8694e16a5ee2ada4fc7efebe8c139eb60d4aaa2d5ae0e1dd93c2afe81809b82"}
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.592112 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8694e16a5ee2ada4fc7efebe8c139eb60d4aaa2d5ae0e1dd93c2afe81809b82"
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.593661 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7n5vf" event={"ID":"e1f94c8d-bdff-499c-8e4f-1c5a022f328f","Type":"ContainerDied","Data":"fb53144ebb0a2bb1e4c80b0a1b56c0216da60c3f6ee16ca27f337dfba3228a37"}
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.593681 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7n5vf"
Feb 17 17:02:08 crc kubenswrapper[4694]: I0217 17:02:08.593690 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb53144ebb0a2bb1e4c80b0a1b56c0216da60c3f6ee16ca27f337dfba3228a37"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.603271 4694 generic.go:334] "Generic (PLEG): container finished" podID="02c3f4af-5b32-4e7f-a562-6fd529a1abaf" containerID="08807e28790f483219c90218f1222e272851656dca7c0fc037261c614e606669" exitCode=0
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.603365 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wl8mg" event={"ID":"02c3f4af-5b32-4e7f-a562-6fd529a1abaf","Type":"ContainerDied","Data":"08807e28790f483219c90218f1222e272851656dca7c0fc037261c614e606669"}
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.945884 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qkvkc"]
Feb 17 17:02:09 crc kubenswrapper[4694]: E0217 17:02:09.946360 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f94c8d-bdff-499c-8e4f-1c5a022f328f" containerName="mariadb-database-create"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.946439 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f94c8d-bdff-499c-8e4f-1c5a022f328f" containerName="mariadb-database-create"
Feb 17 17:02:09 crc kubenswrapper[4694]: E0217 17:02:09.946504 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a2e4cd-27d5-4b34-8b28-61db84dafc59" containerName="mariadb-account-create-update"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.946554 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a2e4cd-27d5-4b34-8b28-61db84dafc59" containerName="mariadb-account-create-update"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.946760 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f94c8d-bdff-499c-8e4f-1c5a022f328f" containerName="mariadb-database-create"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.946819 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a2e4cd-27d5-4b34-8b28-61db84dafc59" containerName="mariadb-account-create-update"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.949340 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.952296 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.952347 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fp2f"
Feb 17 17:02:09 crc kubenswrapper[4694]: I0217 17:02:09.960451 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qkvkc"]
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.037943 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-db-sync-config-data\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.038048 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2p77\" (UniqueName: \"kubernetes.io/projected/5b3b5789-af72-49a1-8003-151eb88df06e-kube-api-access-j2p77\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.038096 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-combined-ca-bundle\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.038165 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-config-data\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.140026 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-combined-ca-bundle\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.140150 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-config-data\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.140243 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-db-sync-config-data\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.140302 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2p77\" (UniqueName: \"kubernetes.io/projected/5b3b5789-af72-49a1-8003-151eb88df06e-kube-api-access-j2p77\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.144968 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-config-data\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.145077 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-combined-ca-bundle\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.145266 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-db-sync-config-data\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.158291 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2p77\" (UniqueName: \"kubernetes.io/projected/5b3b5789-af72-49a1-8003-151eb88df06e-kube-api-access-j2p77\") pod \"glance-db-sync-qkvkc\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.270229 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qkvkc"
Feb 17 17:02:10 crc kubenswrapper[4694]: I0217 17:02:10.923351 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qkvkc"]
Feb 17 17:02:11 crc kubenswrapper[4694]: I0217 17:02:11.058361 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0"
Feb 17 17:02:11 crc kubenswrapper[4694]: E0217 17:02:11.058632 4694 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 17:02:11 crc kubenswrapper[4694]: E0217 17:02:11.059509 4694 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 17:02:11 crc kubenswrapper[4694]: E0217 17:02:11.059560 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift podName:852493c7-97b4-4850-9ef3-44ec598d9d1a nodeName:}" failed. No retries permitted until 2026-02-17 17:02:19.059544249 +0000 UTC m=+1206.816619573 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift") pod "swift-storage-0" (UID: "852493c7-97b4-4850-9ef3-44ec598d9d1a") : configmap "swift-ring-files" not found
Feb 17 17:02:11 crc kubenswrapper[4694]: W0217 17:02:11.457342 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b3b5789_af72_49a1_8003_151eb88df06e.slice/crio-d794f2cf28c7110da79cce58d46260c86efd47f6df5777ce26bca0ae4a0276ba WatchSource:0}: Error finding container d794f2cf28c7110da79cce58d46260c86efd47f6df5777ce26bca0ae4a0276ba: Status 404 returned error can't find the container with id d794f2cf28c7110da79cce58d46260c86efd47f6df5777ce26bca0ae4a0276ba
Feb 17 17:02:11 crc kubenswrapper[4694]: I0217 17:02:11.638761 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qkvkc" event={"ID":"5b3b5789-af72-49a1-8003-151eb88df06e","Type":"ContainerStarted","Data":"d794f2cf28c7110da79cce58d46260c86efd47f6df5777ce26bca0ae4a0276ba"}
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.443496 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf"
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.495254 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dxdcp"]
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.495471 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerName="dnsmasq-dns" containerID="cri-o://be87d18c7c9fae6df4b7d7c5487b8b52c6b1fb47aae5c7dbc482fd06e9e7d1df" gracePeriod=10
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.549936 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.654866 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wl8mg" event={"ID":"02c3f4af-5b32-4e7f-a562-6fd529a1abaf","Type":"ContainerDied","Data":"0bdc25861f9ec39c887b402e125b00357a679ed32beb2ff2fcab802b98d0e1a7"}
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.654907 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bdc25861f9ec39c887b402e125b00357a679ed32beb2ff2fcab802b98d0e1a7"
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.655045 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wl8mg"
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.657188 4694 generic.go:334] "Generic (PLEG): container finished" podID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerID="be87d18c7c9fae6df4b7d7c5487b8b52c6b1fb47aae5c7dbc482fd06e9e7d1df" exitCode=0
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.657221 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" event={"ID":"ee6f4177-5e36-4e81-a1fe-ed0a715df304","Type":"ContainerDied","Data":"be87d18c7c9fae6df4b7d7c5487b8b52c6b1fb47aae5c7dbc482fd06e9e7d1df"}
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.690312 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-operator-scripts\") pod \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") "
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.690366 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn7p6\" (UniqueName: \"kubernetes.io/projected/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-kube-api-access-mn7p6\") pod \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\" (UID: \"02c3f4af-5b32-4e7f-a562-6fd529a1abaf\") "
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.691812 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02c3f4af-5b32-4e7f-a562-6fd529a1abaf" (UID: "02c3f4af-5b32-4e7f-a562-6fd529a1abaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.694500 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-kube-api-access-mn7p6" (OuterVolumeSpecName: "kube-api-access-mn7p6") pod "02c3f4af-5b32-4e7f-a562-6fd529a1abaf" (UID: "02c3f4af-5b32-4e7f-a562-6fd529a1abaf"). InnerVolumeSpecName "kube-api-access-mn7p6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.794212 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.794240 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn7p6\" (UniqueName: \"kubernetes.io/projected/02c3f4af-5b32-4e7f-a562-6fd529a1abaf-kube-api-access-mn7p6\") on node \"crc\" DevicePath \"\""
Feb 17 17:02:12 crc kubenswrapper[4694]: I0217 17:02:12.966939 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp"
Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.102708 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm7rb\" (UniqueName: \"kubernetes.io/projected/ee6f4177-5e36-4e81-a1fe-ed0a715df304-kube-api-access-zm7rb\") pod \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") "
Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.102934 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-dns-svc\") pod \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") "
Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.103038 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-config\") pod \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\" (UID: \"ee6f4177-5e36-4e81-a1fe-ed0a715df304\") "
Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.107290 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6f4177-5e36-4e81-a1fe-ed0a715df304-kube-api-access-zm7rb" (OuterVolumeSpecName: "kube-api-access-zm7rb") pod "ee6f4177-5e36-4e81-a1fe-ed0a715df304" (UID: "ee6f4177-5e36-4e81-a1fe-ed0a715df304"). InnerVolumeSpecName "kube-api-access-zm7rb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.141375 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-config" (OuterVolumeSpecName: "config") pod "ee6f4177-5e36-4e81-a1fe-ed0a715df304" (UID: "ee6f4177-5e36-4e81-a1fe-ed0a715df304"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.141451 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee6f4177-5e36-4e81-a1fe-ed0a715df304" (UID: "ee6f4177-5e36-4e81-a1fe-ed0a715df304"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.204272 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.204574 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee6f4177-5e36-4e81-a1fe-ed0a715df304-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.204584 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm7rb\" (UniqueName: \"kubernetes.io/projected/ee6f4177-5e36-4e81-a1fe-ed0a715df304-kube-api-access-zm7rb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.671838 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q6jc9" event={"ID":"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3","Type":"ContainerStarted","Data":"6e5f4ed7169cfa3e148a870efd8136bb7bae7e4e866da53912bc3b7ef07a4db9"} Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.675210 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" event={"ID":"ee6f4177-5e36-4e81-a1fe-ed0a715df304","Type":"ContainerDied","Data":"d61cfcf41a4bb12dd7828fd537b0189011e350e3543c71e95f8025b25e327b7c"} Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.675227 4694 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dxdcp" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.675262 4694 scope.go:117] "RemoveContainer" containerID="be87d18c7c9fae6df4b7d7c5487b8b52c6b1fb47aae5c7dbc482fd06e9e7d1df" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.695654 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-q6jc9" podStartSLOduration=2.198027623 podStartE2EDuration="6.695636441s" podCreationTimestamp="2026-02-17 17:02:07 +0000 UTC" firstStartedPulling="2026-02-17 17:02:07.87398383 +0000 UTC m=+1195.631059154" lastFinishedPulling="2026-02-17 17:02:12.371592648 +0000 UTC m=+1200.128667972" observedRunningTime="2026-02-17 17:02:13.691642593 +0000 UTC m=+1201.448717917" watchObservedRunningTime="2026-02-17 17:02:13.695636441 +0000 UTC m=+1201.452711765" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.711477 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dxdcp"] Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.713950 4694 scope.go:117] "RemoveContainer" containerID="b2ffc52c4df72d0a6151f99c7a19941e9d273995bdeb1009b90ad65499132677" Feb 17 17:02:13 crc kubenswrapper[4694]: I0217 17:02:13.716789 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dxdcp"] Feb 17 17:02:14 crc kubenswrapper[4694]: I0217 17:02:14.913931 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" path="/var/lib/kubelet/pods/ee6f4177-5e36-4e81-a1fe-ed0a715df304/volumes" Feb 17 17:02:15 crc kubenswrapper[4694]: I0217 17:02:15.302580 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 17:02:17 crc kubenswrapper[4694]: I0217 17:02:17.983178 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wl8mg"] Feb 17 17:02:17 crc 
kubenswrapper[4694]: I0217 17:02:17.989958 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wl8mg"] Feb 17 17:02:18 crc kubenswrapper[4694]: E0217 17:02:18.747155 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c3f4af_5b32_4e7f_a562_6fd529a1abaf.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:02:18 crc kubenswrapper[4694]: I0217 17:02:18.906486 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c3f4af-5b32-4e7f-a562-6fd529a1abaf" path="/var/lib/kubelet/pods/02c3f4af-5b32-4e7f-a562-6fd529a1abaf/volumes" Feb 17 17:02:19 crc kubenswrapper[4694]: I0217 17:02:19.107866 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:19 crc kubenswrapper[4694]: E0217 17:02:19.109167 4694 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 17:02:19 crc kubenswrapper[4694]: E0217 17:02:19.109188 4694 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 17:02:19 crc kubenswrapper[4694]: E0217 17:02:19.109229 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift podName:852493c7-97b4-4850-9ef3-44ec598d9d1a nodeName:}" failed. No retries permitted until 2026-02-17 17:02:35.109214462 +0000 UTC m=+1222.866289786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift") pod "swift-storage-0" (UID: "852493c7-97b4-4850-9ef3-44ec598d9d1a") : configmap "swift-ring-files" not found Feb 17 17:02:20 crc kubenswrapper[4694]: I0217 17:02:20.559443 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-stczv" podUID="45514b0e-57f3-494a-823a-2a0f0c2f728d" containerName="ovn-controller" probeResult="failure" output=< Feb 17 17:02:20 crc kubenswrapper[4694]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 17:02:20 crc kubenswrapper[4694]: > Feb 17 17:02:20 crc kubenswrapper[4694]: I0217 17:02:20.740912 4694 generic.go:334] "Generic (PLEG): container finished" podID="78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" containerID="6e5f4ed7169cfa3e148a870efd8136bb7bae7e4e866da53912bc3b7ef07a4db9" exitCode=0 Feb 17 17:02:20 crc kubenswrapper[4694]: I0217 17:02:20.740946 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q6jc9" event={"ID":"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3","Type":"ContainerDied","Data":"6e5f4ed7169cfa3e148a870efd8136bb7bae7e4e866da53912bc3b7ef07a4db9"} Feb 17 17:02:21 crc kubenswrapper[4694]: I0217 17:02:21.752446 4694 generic.go:334] "Generic (PLEG): container finished" podID="30f5bee5-cb28-4508-b091-35e85e299afa" containerID="f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e" exitCode=0 Feb 17 17:02:21 crc kubenswrapper[4694]: I0217 17:02:21.752516 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30f5bee5-cb28-4508-b091-35e85e299afa","Type":"ContainerDied","Data":"f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e"} Feb 17 17:02:21 crc kubenswrapper[4694]: I0217 17:02:21.755205 4694 generic.go:334] "Generic (PLEG): container finished" podID="647b8309-483b-4f58-8360-202bb4b14824" 
containerID="3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053" exitCode=0 Feb 17 17:02:21 crc kubenswrapper[4694]: I0217 17:02:21.755421 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"647b8309-483b-4f58-8360-202bb4b14824","Type":"ContainerDied","Data":"3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053"} Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.990823 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mmbl9"] Feb 17 17:02:22 crc kubenswrapper[4694]: E0217 17:02:22.991654 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerName="init" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.991671 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerName="init" Feb 17 17:02:22 crc kubenswrapper[4694]: E0217 17:02:22.991681 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerName="dnsmasq-dns" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.991752 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerName="dnsmasq-dns" Feb 17 17:02:22 crc kubenswrapper[4694]: E0217 17:02:22.991767 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c3f4af-5b32-4e7f-a562-6fd529a1abaf" containerName="mariadb-account-create-update" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.991776 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c3f4af-5b32-4e7f-a562-6fd529a1abaf" containerName="mariadb-account-create-update" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.992194 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c3f4af-5b32-4e7f-a562-6fd529a1abaf" containerName="mariadb-account-create-update" Feb 17 17:02:22 crc 
kubenswrapper[4694]: I0217 17:02:22.992216 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6f4177-5e36-4e81-a1fe-ed0a715df304" containerName="dnsmasq-dns" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.992756 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.994870 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 17:02:22 crc kubenswrapper[4694]: I0217 17:02:22.998592 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mmbl9"] Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.088270 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mfml\" (UniqueName: \"kubernetes.io/projected/3bdf9bc5-0b05-4382-a139-24f51593f749-kube-api-access-6mfml\") pod \"root-account-create-update-mmbl9\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.088368 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bdf9bc5-0b05-4382-a139-24f51593f749-operator-scripts\") pod \"root-account-create-update-mmbl9\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.189938 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mfml\" (UniqueName: \"kubernetes.io/projected/3bdf9bc5-0b05-4382-a139-24f51593f749-kube-api-access-6mfml\") pod \"root-account-create-update-mmbl9\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc 
kubenswrapper[4694]: I0217 17:02:23.190177 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bdf9bc5-0b05-4382-a139-24f51593f749-operator-scripts\") pod \"root-account-create-update-mmbl9\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.190920 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bdf9bc5-0b05-4382-a139-24f51593f749-operator-scripts\") pod \"root-account-create-update-mmbl9\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.210193 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mfml\" (UniqueName: \"kubernetes.io/projected/3bdf9bc5-0b05-4382-a139-24f51593f749-kube-api-access-6mfml\") pod \"root-account-create-update-mmbl9\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.264730 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-q6jc9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.293870 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-scripts\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.293945 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-ring-data-devices\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.293985 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-dispersionconf\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.294022 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-etc-swift\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.294078 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-swiftconf\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.294148 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srkj\" (UniqueName: 
\"kubernetes.io/projected/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-kube-api-access-2srkj\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.294165 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-combined-ca-bundle\") pod \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\" (UID: \"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3\") " Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.297054 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.297926 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.302266 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-kube-api-access-2srkj" (OuterVolumeSpecName: "kube-api-access-2srkj") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "kube-api-access-2srkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.317058 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.317065 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-scripts" (OuterVolumeSpecName: "scripts") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.317762 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.318918 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.327373 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" (UID: "78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395349 4694 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395688 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2srkj\" (UniqueName: \"kubernetes.io/projected/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-kube-api-access-2srkj\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395713 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395728 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395740 4694 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395753 4694 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.395763 4694 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.768020 4694 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mmbl9"] Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.770886 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q6jc9" event={"ID":"78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3","Type":"ContainerDied","Data":"3984162caae755763d8c2981117cf196104532d3d52ae0ff79ed20f06149c448"} Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.770929 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3984162caae755763d8c2981117cf196104532d3d52ae0ff79ed20f06149c448" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.770899 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q6jc9" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.772439 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"647b8309-483b-4f58-8360-202bb4b14824","Type":"ContainerStarted","Data":"3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e"} Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.772919 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.776785 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30f5bee5-cb28-4508-b091-35e85e299afa","Type":"ContainerStarted","Data":"f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9"} Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.777048 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:02:23 crc kubenswrapper[4694]: W0217 17:02:23.785190 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bdf9bc5_0b05_4382_a139_24f51593f749.slice/crio-fb4b11ca15e286fffc5c86e6fe369aa28ab58549b95a17f0a4d0cb458bd60186 WatchSource:0}: Error finding container fb4b11ca15e286fffc5c86e6fe369aa28ab58549b95a17f0a4d0cb458bd60186: Status 404 returned error can't find the container with id fb4b11ca15e286fffc5c86e6fe369aa28ab58549b95a17f0a4d0cb458bd60186 Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.805387 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.268828227 podStartE2EDuration="59.805370754s" podCreationTimestamp="2026-02-17 17:01:24 +0000 UTC" firstStartedPulling="2026-02-17 17:01:40.798026868 +0000 UTC m=+1168.555102182" lastFinishedPulling="2026-02-17 17:01:47.334569385 +0000 UTC m=+1175.091644709" observedRunningTime="2026-02-17 17:02:23.804523623 +0000 UTC m=+1211.561598967" watchObservedRunningTime="2026-02-17 17:02:23.805370754 +0000 UTC m=+1211.562446078" Feb 17 17:02:23 crc kubenswrapper[4694]: I0217 17:02:23.836942 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.33791998 podStartE2EDuration="58.836923932s" podCreationTimestamp="2026-02-17 17:01:25 +0000 UTC" firstStartedPulling="2026-02-17 17:01:40.858354095 +0000 UTC m=+1168.615429419" lastFinishedPulling="2026-02-17 17:01:47.357358047 +0000 UTC m=+1175.114433371" observedRunningTime="2026-02-17 17:02:23.832831431 +0000 UTC m=+1211.589906775" watchObservedRunningTime="2026-02-17 17:02:23.836923932 +0000 UTC m=+1211.593999256" Feb 17 17:02:24 crc kubenswrapper[4694]: I0217 17:02:24.785579 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qkvkc" event={"ID":"5b3b5789-af72-49a1-8003-151eb88df06e","Type":"ContainerStarted","Data":"f1753018eaf4672a884f59e1b88afbc63d9da89e9860846e99a377180d8f762d"} Feb 17 17:02:24 crc kubenswrapper[4694]: 
I0217 17:02:24.787683 4694 generic.go:334] "Generic (PLEG): container finished" podID="3bdf9bc5-0b05-4382-a139-24f51593f749" containerID="7a1057d50737cbb1ff2f673544f86aedd1f0d217a56937acde54e48d67956437" exitCode=0 Feb 17 17:02:24 crc kubenswrapper[4694]: I0217 17:02:24.787754 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmbl9" event={"ID":"3bdf9bc5-0b05-4382-a139-24f51593f749","Type":"ContainerDied","Data":"7a1057d50737cbb1ff2f673544f86aedd1f0d217a56937acde54e48d67956437"} Feb 17 17:02:24 crc kubenswrapper[4694]: I0217 17:02:24.787825 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmbl9" event={"ID":"3bdf9bc5-0b05-4382-a139-24f51593f749","Type":"ContainerStarted","Data":"fb4b11ca15e286fffc5c86e6fe369aa28ab58549b95a17f0a4d0cb458bd60186"} Feb 17 17:02:24 crc kubenswrapper[4694]: I0217 17:02:24.832397 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qkvkc" podStartSLOduration=4.019634635 podStartE2EDuration="15.832381815s" podCreationTimestamp="2026-02-17 17:02:09 +0000 UTC" firstStartedPulling="2026-02-17 17:02:11.459426948 +0000 UTC m=+1199.216502282" lastFinishedPulling="2026-02-17 17:02:23.272174138 +0000 UTC m=+1211.029249462" observedRunningTime="2026-02-17 17:02:24.817839086 +0000 UTC m=+1212.574914410" watchObservedRunningTime="2026-02-17 17:02:24.832381815 +0000 UTC m=+1212.589457139" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.562772 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-stczv" podUID="45514b0e-57f3-494a-823a-2a0f0c2f728d" containerName="ovn-controller" probeResult="failure" output=< Feb 17 17:02:25 crc kubenswrapper[4694]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 17:02:25 crc kubenswrapper[4694]: > Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.631138 4694 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.633802 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-q88vj" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.872542 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-stczv-config-ltljw"] Feb 17 17:02:25 crc kubenswrapper[4694]: E0217 17:02:25.876292 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" containerName="swift-ring-rebalance" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.876335 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" containerName="swift-ring-rebalance" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.876548 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3" containerName="swift-ring-rebalance" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.877187 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.879504 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 17:02:25 crc kubenswrapper[4694]: I0217 17:02:25.882380 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-stczv-config-ltljw"] Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.039327 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-scripts\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.039444 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run-ovn\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.039481 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.039522 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmz4\" (UniqueName: \"kubernetes.io/projected/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-kube-api-access-llmz4\") pod \"ovn-controller-stczv-config-ltljw\" (UID: 
\"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.039574 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-log-ovn\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.039605 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-additional-scripts\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141488 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run-ovn\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141553 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141588 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llmz4\" (UniqueName: \"kubernetes.io/projected/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-kube-api-access-llmz4\") pod 
\"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141647 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-log-ovn\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141670 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-additional-scripts\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141741 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-scripts\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.141995 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-log-ovn\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.142035 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run\") pod \"ovn-controller-stczv-config-ltljw\" (UID: 
\"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.142054 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run-ovn\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.142483 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-additional-scripts\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.144474 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-scripts\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.162426 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llmz4\" (UniqueName: \"kubernetes.io/projected/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-kube-api-access-llmz4\") pod \"ovn-controller-stczv-config-ltljw\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.201397 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.206711 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.344441 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mfml\" (UniqueName: \"kubernetes.io/projected/3bdf9bc5-0b05-4382-a139-24f51593f749-kube-api-access-6mfml\") pod \"3bdf9bc5-0b05-4382-a139-24f51593f749\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.345191 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bdf9bc5-0b05-4382-a139-24f51593f749-operator-scripts\") pod \"3bdf9bc5-0b05-4382-a139-24f51593f749\" (UID: \"3bdf9bc5-0b05-4382-a139-24f51593f749\") " Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.345756 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bdf9bc5-0b05-4382-a139-24f51593f749-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bdf9bc5-0b05-4382-a139-24f51593f749" (UID: "3bdf9bc5-0b05-4382-a139-24f51593f749"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.349814 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdf9bc5-0b05-4382-a139-24f51593f749-kube-api-access-6mfml" (OuterVolumeSpecName: "kube-api-access-6mfml") pod "3bdf9bc5-0b05-4382-a139-24f51593f749" (UID: "3bdf9bc5-0b05-4382-a139-24f51593f749"). InnerVolumeSpecName "kube-api-access-6mfml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.447113 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bdf9bc5-0b05-4382-a139-24f51593f749-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.447139 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mfml\" (UniqueName: \"kubernetes.io/projected/3bdf9bc5-0b05-4382-a139-24f51593f749-kube-api-access-6mfml\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.665208 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-stczv-config-ltljw"] Feb 17 17:02:26 crc kubenswrapper[4694]: W0217 17:02:26.673988 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b8cc15b_1c0c_4500_886d_fe4d5a346de7.slice/crio-62eb061f460946af7e5b57c03e4390dbfc3c6c0a5319d0694fe971ec1b5a1c08 WatchSource:0}: Error finding container 62eb061f460946af7e5b57c03e4390dbfc3c6c0a5319d0694fe971ec1b5a1c08: Status 404 returned error can't find the container with id 62eb061f460946af7e5b57c03e4390dbfc3c6c0a5319d0694fe971ec1b5a1c08 Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.807243 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mmbl9" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.807232 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mmbl9" event={"ID":"3bdf9bc5-0b05-4382-a139-24f51593f749","Type":"ContainerDied","Data":"fb4b11ca15e286fffc5c86e6fe369aa28ab58549b95a17f0a4d0cb458bd60186"} Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.807373 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb4b11ca15e286fffc5c86e6fe369aa28ab58549b95a17f0a4d0cb458bd60186" Feb 17 17:02:26 crc kubenswrapper[4694]: I0217 17:02:26.813214 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv-config-ltljw" event={"ID":"9b8cc15b-1c0c-4500-886d-fe4d5a346de7","Type":"ContainerStarted","Data":"62eb061f460946af7e5b57c03e4390dbfc3c6c0a5319d0694fe971ec1b5a1c08"} Feb 17 17:02:27 crc kubenswrapper[4694]: I0217 17:02:27.821867 4694 generic.go:334] "Generic (PLEG): container finished" podID="9b8cc15b-1c0c-4500-886d-fe4d5a346de7" containerID="79457e6ee360e960dec5f3b489c05458b2915f47c1f9389813490d2a5c2da2f3" exitCode=0 Feb 17 17:02:27 crc kubenswrapper[4694]: I0217 17:02:27.821922 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv-config-ltljw" event={"ID":"9b8cc15b-1c0c-4500-886d-fe4d5a346de7","Type":"ContainerDied","Data":"79457e6ee360e960dec5f3b489c05458b2915f47c1f9389813490d2a5c2da2f3"} Feb 17 17:02:28 crc kubenswrapper[4694]: E0217 17:02:28.939949 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c3f4af_5b32_4e7f_a562_6fd529a1abaf.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.218809 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.296486 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run-ovn\") pod \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.296537 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run\") pod \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.296685 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-additional-scripts\") pod \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.296744 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-scripts\") pod \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.296785 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-log-ovn\") pod \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.296818 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llmz4\" (UniqueName: 
\"kubernetes.io/projected/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-kube-api-access-llmz4\") pod \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\" (UID: \"9b8cc15b-1c0c-4500-886d-fe4d5a346de7\") " Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.298297 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run" (OuterVolumeSpecName: "var-run") pod "9b8cc15b-1c0c-4500-886d-fe4d5a346de7" (UID: "9b8cc15b-1c0c-4500-886d-fe4d5a346de7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.298341 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9b8cc15b-1c0c-4500-886d-fe4d5a346de7" (UID: "9b8cc15b-1c0c-4500-886d-fe4d5a346de7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.298469 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9b8cc15b-1c0c-4500-886d-fe4d5a346de7" (UID: "9b8cc15b-1c0c-4500-886d-fe4d5a346de7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.298886 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9b8cc15b-1c0c-4500-886d-fe4d5a346de7" (UID: "9b8cc15b-1c0c-4500-886d-fe4d5a346de7"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.299216 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-scripts" (OuterVolumeSpecName: "scripts") pod "9b8cc15b-1c0c-4500-886d-fe4d5a346de7" (UID: "9b8cc15b-1c0c-4500-886d-fe4d5a346de7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.310484 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-kube-api-access-llmz4" (OuterVolumeSpecName: "kube-api-access-llmz4") pod "9b8cc15b-1c0c-4500-886d-fe4d5a346de7" (UID: "9b8cc15b-1c0c-4500-886d-fe4d5a346de7"). InnerVolumeSpecName "kube-api-access-llmz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.399201 4694 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.399690 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.399716 4694 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.399730 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llmz4\" (UniqueName: \"kubernetes.io/projected/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-kube-api-access-llmz4\") on node \"crc\" 
DevicePath \"\"" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.399745 4694 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.399759 4694 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b8cc15b-1c0c-4500-886d-fe4d5a346de7-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.843241 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv-config-ltljw" event={"ID":"9b8cc15b-1c0c-4500-886d-fe4d5a346de7","Type":"ContainerDied","Data":"62eb061f460946af7e5b57c03e4390dbfc3c6c0a5319d0694fe971ec1b5a1c08"} Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.843282 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62eb061f460946af7e5b57c03e4390dbfc3c6c0a5319d0694fe971ec1b5a1c08" Feb 17 17:02:29 crc kubenswrapper[4694]: I0217 17:02:29.843340 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-ltljw" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.304585 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-stczv-config-ltljw"] Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.309655 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-stczv-config-ltljw"] Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.412091 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-stczv-config-fjlpt"] Feb 17 17:02:30 crc kubenswrapper[4694]: E0217 17:02:30.412385 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8cc15b-1c0c-4500-886d-fe4d5a346de7" containerName="ovn-config" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.412395 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8cc15b-1c0c-4500-886d-fe4d5a346de7" containerName="ovn-config" Feb 17 17:02:30 crc kubenswrapper[4694]: E0217 17:02:30.412419 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdf9bc5-0b05-4382-a139-24f51593f749" containerName="mariadb-account-create-update" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.412447 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdf9bc5-0b05-4382-a139-24f51593f749" containerName="mariadb-account-create-update" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.412619 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8cc15b-1c0c-4500-886d-fe4d5a346de7" containerName="ovn-config" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.412641 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdf9bc5-0b05-4382-a139-24f51593f749" containerName="mariadb-account-create-update" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.413078 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.414602 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.433258 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-stczv-config-fjlpt"] Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.522245 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-scripts\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.522321 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-additional-scripts\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.522408 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-log-ovn\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.522451 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run-ovn\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: 
\"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.522488 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.522516 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wps\" (UniqueName: \"kubernetes.io/projected/c825697b-f40a-474c-b2c1-67b8ab1eab37-kube-api-access-g4wps\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.557924 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-stczv" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.623993 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-log-ovn\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624066 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run-ovn\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624103 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624126 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wps\" (UniqueName: \"kubernetes.io/projected/c825697b-f40a-474c-b2c1-67b8ab1eab37-kube-api-access-g4wps\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624213 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-scripts\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624268 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-additional-scripts\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624294 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-log-ovn\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624374 4694 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.624666 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run-ovn\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.625162 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-additional-scripts\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.627173 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-scripts\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.649110 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wps\" (UniqueName: \"kubernetes.io/projected/c825697b-f40a-474c-b2c1-67b8ab1eab37-kube-api-access-g4wps\") pod \"ovn-controller-stczv-config-fjlpt\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.726668 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.864842 4694 generic.go:334] "Generic (PLEG): container finished" podID="5b3b5789-af72-49a1-8003-151eb88df06e" containerID="f1753018eaf4672a884f59e1b88afbc63d9da89e9860846e99a377180d8f762d" exitCode=0 Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.864893 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qkvkc" event={"ID":"5b3b5789-af72-49a1-8003-151eb88df06e","Type":"ContainerDied","Data":"f1753018eaf4672a884f59e1b88afbc63d9da89e9860846e99a377180d8f762d"} Feb 17 17:02:30 crc kubenswrapper[4694]: I0217 17:02:30.907678 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8cc15b-1c0c-4500-886d-fe4d5a346de7" path="/var/lib/kubelet/pods/9b8cc15b-1c0c-4500-886d-fe4d5a346de7/volumes" Feb 17 17:02:31 crc kubenswrapper[4694]: I0217 17:02:31.097945 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-stczv-config-fjlpt"] Feb 17 17:02:31 crc kubenswrapper[4694]: I0217 17:02:31.873380 4694 generic.go:334] "Generic (PLEG): container finished" podID="c825697b-f40a-474c-b2c1-67b8ab1eab37" containerID="6ab0ff5e18e690db7ef239db82f91bc6913210fb326bcf714effdd8d616ad7b4" exitCode=0 Feb 17 17:02:31 crc kubenswrapper[4694]: I0217 17:02:31.873461 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv-config-fjlpt" event={"ID":"c825697b-f40a-474c-b2c1-67b8ab1eab37","Type":"ContainerDied","Data":"6ab0ff5e18e690db7ef239db82f91bc6913210fb326bcf714effdd8d616ad7b4"} Feb 17 17:02:31 crc kubenswrapper[4694]: I0217 17:02:31.873491 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv-config-fjlpt" event={"ID":"c825697b-f40a-474c-b2c1-67b8ab1eab37","Type":"ContainerStarted","Data":"bf7b08fb1ed25c2e742bb10599cfd398aa9040c16cc1a34c644a7d05786a9f18"} Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 
17:02:32.227104 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qkvkc" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.362332 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2p77\" (UniqueName: \"kubernetes.io/projected/5b3b5789-af72-49a1-8003-151eb88df06e-kube-api-access-j2p77\") pod \"5b3b5789-af72-49a1-8003-151eb88df06e\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.362385 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-combined-ca-bundle\") pod \"5b3b5789-af72-49a1-8003-151eb88df06e\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.362416 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-db-sync-config-data\") pod \"5b3b5789-af72-49a1-8003-151eb88df06e\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.362453 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-config-data\") pod \"5b3b5789-af72-49a1-8003-151eb88df06e\" (UID: \"5b3b5789-af72-49a1-8003-151eb88df06e\") " Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.367463 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5b3b5789-af72-49a1-8003-151eb88df06e" (UID: "5b3b5789-af72-49a1-8003-151eb88df06e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.367780 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3b5789-af72-49a1-8003-151eb88df06e-kube-api-access-j2p77" (OuterVolumeSpecName: "kube-api-access-j2p77") pod "5b3b5789-af72-49a1-8003-151eb88df06e" (UID: "5b3b5789-af72-49a1-8003-151eb88df06e"). InnerVolumeSpecName "kube-api-access-j2p77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.389832 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b3b5789-af72-49a1-8003-151eb88df06e" (UID: "5b3b5789-af72-49a1-8003-151eb88df06e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.411339 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-config-data" (OuterVolumeSpecName: "config-data") pod "5b3b5789-af72-49a1-8003-151eb88df06e" (UID: "5b3b5789-af72-49a1-8003-151eb88df06e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.464533 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2p77\" (UniqueName: \"kubernetes.io/projected/5b3b5789-af72-49a1-8003-151eb88df06e-kube-api-access-j2p77\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.464570 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.464580 4694 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.464590 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b3b5789-af72-49a1-8003-151eb88df06e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.883632 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qkvkc" event={"ID":"5b3b5789-af72-49a1-8003-151eb88df06e","Type":"ContainerDied","Data":"d794f2cf28c7110da79cce58d46260c86efd47f6df5777ce26bca0ae4a0276ba"} Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.883925 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d794f2cf28c7110da79cce58d46260c86efd47f6df5777ce26bca0ae4a0276ba" Feb 17 17:02:32 crc kubenswrapper[4694]: I0217 17:02:32.883657 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qkvkc" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.204122 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.286618 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-fvgsp"] Feb 17 17:02:33 crc kubenswrapper[4694]: E0217 17:02:33.286927 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c825697b-f40a-474c-b2c1-67b8ab1eab37" containerName="ovn-config" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.286944 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c825697b-f40a-474c-b2c1-67b8ab1eab37" containerName="ovn-config" Feb 17 17:02:33 crc kubenswrapper[4694]: E0217 17:02:33.286965 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3b5789-af72-49a1-8003-151eb88df06e" containerName="glance-db-sync" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.286970 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3b5789-af72-49a1-8003-151eb88df06e" containerName="glance-db-sync" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.287110 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="c825697b-f40a-474c-b2c1-67b8ab1eab37" containerName="ovn-config" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.287126 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3b5789-af72-49a1-8003-151eb88df06e" containerName="glance-db-sync" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.288071 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.300336 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-fvgsp"] Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.377743 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run\") pod \"c825697b-f40a-474c-b2c1-67b8ab1eab37\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.377868 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run" (OuterVolumeSpecName: "var-run") pod "c825697b-f40a-474c-b2c1-67b8ab1eab37" (UID: "c825697b-f40a-474c-b2c1-67b8ab1eab37"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.377875 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-additional-scripts\") pod \"c825697b-f40a-474c-b2c1-67b8ab1eab37\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.377959 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wps\" (UniqueName: \"kubernetes.io/projected/c825697b-f40a-474c-b2c1-67b8ab1eab37-kube-api-access-g4wps\") pod \"c825697b-f40a-474c-b2c1-67b8ab1eab37\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.377994 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run-ovn\") pod 
\"c825697b-f40a-474c-b2c1-67b8ab1eab37\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378137 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c825697b-f40a-474c-b2c1-67b8ab1eab37" (UID: "c825697b-f40a-474c-b2c1-67b8ab1eab37"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378172 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-scripts\") pod \"c825697b-f40a-474c-b2c1-67b8ab1eab37\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378290 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-log-ovn\") pod \"c825697b-f40a-474c-b2c1-67b8ab1eab37\" (UID: \"c825697b-f40a-474c-b2c1-67b8ab1eab37\") " Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378359 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c825697b-f40a-474c-b2c1-67b8ab1eab37" (UID: "c825697b-f40a-474c-b2c1-67b8ab1eab37"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378516 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c825697b-f40a-474c-b2c1-67b8ab1eab37" (UID: "c825697b-f40a-474c-b2c1-67b8ab1eab37"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378673 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7t2\" (UniqueName: \"kubernetes.io/projected/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-kube-api-access-kt7t2\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378715 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378742 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-config\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378771 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-dns-svc\") pod 
\"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378847 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378920 4694 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378922 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-scripts" (OuterVolumeSpecName: "scripts") pod "c825697b-f40a-474c-b2c1-67b8ab1eab37" (UID: "c825697b-f40a-474c-b2c1-67b8ab1eab37"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378942 4694 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378979 4694 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.378997 4694 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c825697b-f40a-474c-b2c1-67b8ab1eab37-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.385091 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c825697b-f40a-474c-b2c1-67b8ab1eab37-kube-api-access-g4wps" (OuterVolumeSpecName: "kube-api-access-g4wps") pod "c825697b-f40a-474c-b2c1-67b8ab1eab37" (UID: "c825697b-f40a-474c-b2c1-67b8ab1eab37"). InnerVolumeSpecName "kube-api-access-g4wps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.480696 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7t2\" (UniqueName: \"kubernetes.io/projected/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-kube-api-access-kt7t2\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.480755 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.480782 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-config\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.480812 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-dns-svc\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.480881 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc 
kubenswrapper[4694]: I0217 17:02:33.480986 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wps\" (UniqueName: \"kubernetes.io/projected/c825697b-f40a-474c-b2c1-67b8ab1eab37-kube-api-access-g4wps\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.481004 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c825697b-f40a-474c-b2c1-67b8ab1eab37-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.481872 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.481925 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-config\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.482053 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-dns-svc\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.482072 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 
17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.498453 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7t2\" (UniqueName: \"kubernetes.io/projected/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-kube-api-access-kt7t2\") pod \"dnsmasq-dns-74dc88fc-fvgsp\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.606550 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.892246 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-stczv-config-fjlpt" event={"ID":"c825697b-f40a-474c-b2c1-67b8ab1eab37","Type":"ContainerDied","Data":"bf7b08fb1ed25c2e742bb10599cfd398aa9040c16cc1a34c644a7d05786a9f18"} Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.893094 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf7b08fb1ed25c2e742bb10599cfd398aa9040c16cc1a34c644a7d05786a9f18" Feb 17 17:02:33 crc kubenswrapper[4694]: I0217 17:02:33.892311 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-stczv-config-fjlpt" Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.183101 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-fvgsp"] Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.295063 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-stczv-config-fjlpt"] Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.303737 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-stczv-config-fjlpt"] Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.902871 4694 generic.go:334] "Generic (PLEG): container finished" podID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerID="27c874d79f7fba7f03ba50525595042c3b87ec5d6b0d41f80e8d8082746a69fe" exitCode=0 Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.904641 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c825697b-f40a-474c-b2c1-67b8ab1eab37" path="/var/lib/kubelet/pods/c825697b-f40a-474c-b2c1-67b8ab1eab37/volumes" Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.905367 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" event={"ID":"6bc3eb44-a604-462b-8a1d-9bb52e39a18b","Type":"ContainerDied","Data":"27c874d79f7fba7f03ba50525595042c3b87ec5d6b0d41f80e8d8082746a69fe"} Feb 17 17:02:34 crc kubenswrapper[4694]: I0217 17:02:34.905402 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" event={"ID":"6bc3eb44-a604-462b-8a1d-9bb52e39a18b","Type":"ContainerStarted","Data":"b003ecd5084315d3280d458884c84d8da0943133b450c3b76fc702fc35539aa5"} Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.209889 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: 
\"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.216628 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/852493c7-97b4-4850-9ef3-44ec598d9d1a-etc-swift\") pod \"swift-storage-0\" (UID: \"852493c7-97b4-4850-9ef3-44ec598d9d1a\") " pod="openstack/swift-storage-0" Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.352312 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.864020 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 17:02:35 crc kubenswrapper[4694]: W0217 17:02:35.867637 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852493c7_97b4_4850_9ef3_44ec598d9d1a.slice/crio-a1ff51c0c72a8e796067c137f1bf84b48844a02da7ae29494d7beadd03f9b1ca WatchSource:0}: Error finding container a1ff51c0c72a8e796067c137f1bf84b48844a02da7ae29494d7beadd03f9b1ca: Status 404 returned error can't find the container with id a1ff51c0c72a8e796067c137f1bf84b48844a02da7ae29494d7beadd03f9b1ca Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.910897 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"a1ff51c0c72a8e796067c137f1bf84b48844a02da7ae29494d7beadd03f9b1ca"} Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.914204 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" event={"ID":"6bc3eb44-a604-462b-8a1d-9bb52e39a18b","Type":"ContainerStarted","Data":"eacc3fdd7029d84917ca9abe1410eeb4a035073b8fb503a5e13127a7402d05fd"} Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.914365 4694 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:35 crc kubenswrapper[4694]: I0217 17:02:35.937288 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" podStartSLOduration=2.9372666929999998 podStartE2EDuration="2.937266693s" podCreationTimestamp="2026-02-17 17:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:35.930485876 +0000 UTC m=+1223.687561210" watchObservedRunningTime="2026-02-17 17:02:35.937266693 +0000 UTC m=+1223.694342017" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.279858 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.650828 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.659186 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bxnn6"] Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.660170 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.673855 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bxnn6"] Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.732865 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-operator-scripts\") pod \"cinder-db-create-bxnn6\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.732983 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pbl4\" (UniqueName: \"kubernetes.io/projected/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-kube-api-access-8pbl4\") pod \"cinder-db-create-bxnn6\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.803939 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7ac7-account-create-update-nd26q"] Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.804843 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.821521 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ac7-account-create-update-nd26q"] Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.830228 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.834623 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pbl4\" (UniqueName: \"kubernetes.io/projected/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-kube-api-access-8pbl4\") pod \"cinder-db-create-bxnn6\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.834799 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-operator-scripts\") pod \"cinder-db-create-bxnn6\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.835622 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-operator-scripts\") pod \"cinder-db-create-bxnn6\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.879286 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pbl4\" (UniqueName: \"kubernetes.io/projected/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-kube-api-access-8pbl4\") pod \"cinder-db-create-bxnn6\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 
17:02:36.935821 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ebd2d40-ad72-4e4d-9264-890f92641e9d-operator-scripts\") pod \"cinder-7ac7-account-create-update-nd26q\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.935876 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhfzj\" (UniqueName: \"kubernetes.io/projected/3ebd2d40-ad72-4e4d-9264-890f92641e9d-kube-api-access-lhfzj\") pod \"cinder-7ac7-account-create-update-nd26q\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.971435 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-l2cxb"] Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.972664 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.982530 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:36 crc kubenswrapper[4694]: I0217 17:02:36.991510 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l2cxb"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.039780 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2483097f-291b-41d8-9428-6bca4956ae91-operator-scripts\") pod \"barbican-db-create-l2cxb\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.039886 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgvl7\" (UniqueName: \"kubernetes.io/projected/2483097f-291b-41d8-9428-6bca4956ae91-kube-api-access-zgvl7\") pod \"barbican-db-create-l2cxb\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.039974 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ebd2d40-ad72-4e4d-9264-890f92641e9d-operator-scripts\") pod \"cinder-7ac7-account-create-update-nd26q\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.039996 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhfzj\" (UniqueName: \"kubernetes.io/projected/3ebd2d40-ad72-4e4d-9264-890f92641e9d-kube-api-access-lhfzj\") pod \"cinder-7ac7-account-create-update-nd26q\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.041371 4694 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ebd2d40-ad72-4e4d-9264-890f92641e9d-operator-scripts\") pod \"cinder-7ac7-account-create-update-nd26q\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.071350 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhfzj\" (UniqueName: \"kubernetes.io/projected/3ebd2d40-ad72-4e4d-9264-890f92641e9d-kube-api-access-lhfzj\") pod \"cinder-7ac7-account-create-update-nd26q\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.075700 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hgsbq"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.076725 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.089423 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hgsbq"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.102082 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0b44-account-create-update-bhwxj"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.103462 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.106859 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.117363 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0b44-account-create-update-bhwxj"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.124670 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.143659 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rddtp\" (UniqueName: \"kubernetes.io/projected/b01db982-56b8-4f3c-98f8-c21954640fce-kube-api-access-rddtp\") pod \"neutron-db-create-hgsbq\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.143764 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2483097f-291b-41d8-9428-6bca4956ae91-operator-scripts\") pod \"barbican-db-create-l2cxb\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.143797 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgvl7\" (UniqueName: \"kubernetes.io/projected/2483097f-291b-41d8-9428-6bca4956ae91-kube-api-access-zgvl7\") pod \"barbican-db-create-l2cxb\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.143840 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b01db982-56b8-4f3c-98f8-c21954640fce-operator-scripts\") pod \"neutron-db-create-hgsbq\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.144725 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2483097f-291b-41d8-9428-6bca4956ae91-operator-scripts\") pod \"barbican-db-create-l2cxb\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.175394 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgvl7\" (UniqueName: \"kubernetes.io/projected/2483097f-291b-41d8-9428-6bca4956ae91-kube-api-access-zgvl7\") pod \"barbican-db-create-l2cxb\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.180629 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-w7fks"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.204337 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.215250 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z26nq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.215453 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.215627 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.215797 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.234828 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w7fks"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246379 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-config-data\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246438 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04936e25-daf1-4d3a-8256-8e1c127688cb-operator-scripts\") pod \"barbican-0b44-account-create-update-bhwxj\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246512 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b01db982-56b8-4f3c-98f8-c21954640fce-operator-scripts\") pod \"neutron-db-create-hgsbq\" 
(UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246538 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ldp\" (UniqueName: \"kubernetes.io/projected/4782973d-718d-4b2a-9b1e-84dfcfbafced-kube-api-access-57ldp\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246554 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-combined-ca-bundle\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246578 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rddtp\" (UniqueName: \"kubernetes.io/projected/b01db982-56b8-4f3c-98f8-c21954640fce-kube-api-access-rddtp\") pod \"neutron-db-create-hgsbq\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.246623 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98bh\" (UniqueName: \"kubernetes.io/projected/04936e25-daf1-4d3a-8256-8e1c127688cb-kube-api-access-m98bh\") pod \"barbican-0b44-account-create-update-bhwxj\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.247240 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b01db982-56b8-4f3c-98f8-c21954640fce-operator-scripts\") pod 
\"neutron-db-create-hgsbq\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.296280 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.304210 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rddtp\" (UniqueName: \"kubernetes.io/projected/b01db982-56b8-4f3c-98f8-c21954640fce-kube-api-access-rddtp\") pod \"neutron-db-create-hgsbq\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.314307 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2194-account-create-update-52gm2"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.315459 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.319508 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.322071 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2194-account-create-update-52gm2"] Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.350902 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04936e25-daf1-4d3a-8256-8e1c127688cb-operator-scripts\") pod \"barbican-0b44-account-create-update-bhwxj\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.350973 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfv8\" 
(UniqueName: \"kubernetes.io/projected/7e7502f1-dba9-43c8-81b3-8516714bca75-kube-api-access-vhfv8\") pod \"neutron-2194-account-create-update-52gm2\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.351076 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ldp\" (UniqueName: \"kubernetes.io/projected/4782973d-718d-4b2a-9b1e-84dfcfbafced-kube-api-access-57ldp\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.351105 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-combined-ca-bundle\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.351159 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m98bh\" (UniqueName: \"kubernetes.io/projected/04936e25-daf1-4d3a-8256-8e1c127688cb-kube-api-access-m98bh\") pod \"barbican-0b44-account-create-update-bhwxj\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.351225 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7502f1-dba9-43c8-81b3-8516714bca75-operator-scripts\") pod \"neutron-2194-account-create-update-52gm2\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.351253 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-config-data\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.352759 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04936e25-daf1-4d3a-8256-8e1c127688cb-operator-scripts\") pod \"barbican-0b44-account-create-update-bhwxj\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.359629 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-config-data\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.372698 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-combined-ca-bundle\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.374910 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98bh\" (UniqueName: \"kubernetes.io/projected/04936e25-daf1-4d3a-8256-8e1c127688cb-kube-api-access-m98bh\") pod \"barbican-0b44-account-create-update-bhwxj\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.375023 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-57ldp\" (UniqueName: \"kubernetes.io/projected/4782973d-718d-4b2a-9b1e-84dfcfbafced-kube-api-access-57ldp\") pod \"keystone-db-sync-w7fks\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.431457 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.452349 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7502f1-dba9-43c8-81b3-8516714bca75-operator-scripts\") pod \"neutron-2194-account-create-update-52gm2\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.452418 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfv8\" (UniqueName: \"kubernetes.io/projected/7e7502f1-dba9-43c8-81b3-8516714bca75-kube-api-access-vhfv8\") pod \"neutron-2194-account-create-update-52gm2\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.453577 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7502f1-dba9-43c8-81b3-8516714bca75-operator-scripts\") pod \"neutron-2194-account-create-update-52gm2\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.457815 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.470204 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfv8\" (UniqueName: \"kubernetes.io/projected/7e7502f1-dba9-43c8-81b3-8516714bca75-kube-api-access-vhfv8\") pod \"neutron-2194-account-create-update-52gm2\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.539767 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.640191 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.694379 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bxnn6"] Feb 17 17:02:37 crc kubenswrapper[4694]: W0217 17:02:37.754596 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda42d3f8d_87a8_4499_a06e_4c0bd452ba66.slice/crio-0cfcf765a64aceaa218d4b03b6ba7059d0af8b57116c2e1030067b6f1a764022 WatchSource:0}: Error finding container 0cfcf765a64aceaa218d4b03b6ba7059d0af8b57116c2e1030067b6f1a764022: Status 404 returned error can't find the container with id 0cfcf765a64aceaa218d4b03b6ba7059d0af8b57116c2e1030067b6f1a764022 Feb 17 17:02:37 crc kubenswrapper[4694]: I0217 17:02:37.961215 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bxnn6" event={"ID":"a42d3f8d-87a8-4499-a06e-4c0bd452ba66","Type":"ContainerStarted","Data":"0cfcf765a64aceaa218d4b03b6ba7059d0af8b57116c2e1030067b6f1a764022"} Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.349205 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-create-l2cxb"] Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.363948 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2194-account-create-update-52gm2"] Feb 17 17:02:38 crc kubenswrapper[4694]: W0217 17:02:38.365828 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e7502f1_dba9_43c8_81b3_8516714bca75.slice/crio-d126631403b4c6720b60a7038a76acf4e25bcf1e1a7476eb23e47df309816753 WatchSource:0}: Error finding container d126631403b4c6720b60a7038a76acf4e25bcf1e1a7476eb23e47df309816753: Status 404 returned error can't find the container with id d126631403b4c6720b60a7038a76acf4e25bcf1e1a7476eb23e47df309816753 Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.373860 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7ac7-account-create-update-nd26q"] Feb 17 17:02:38 crc kubenswrapper[4694]: W0217 17:02:38.407215 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ebd2d40_ad72_4e4d_9264_890f92641e9d.slice/crio-92d72c339b385660da860ad30a17a8651b5163eb17109d8819123ffa7499e865 WatchSource:0}: Error finding container 92d72c339b385660da860ad30a17a8651b5163eb17109d8819123ffa7499e865: Status 404 returned error can't find the container with id 92d72c339b385660da860ad30a17a8651b5163eb17109d8819123ffa7499e865 Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.552948 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0b44-account-create-update-bhwxj"] Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.572242 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hgsbq"] Feb 17 17:02:38 crc kubenswrapper[4694]: W0217 17:02:38.583902 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01db982_56b8_4f3c_98f8_c21954640fce.slice/crio-03aa8dc30f302ec3f5b32e20ede0eb7d1b62a834e6093c5283d5ad313f594203 WatchSource:0}: Error finding container 03aa8dc30f302ec3f5b32e20ede0eb7d1b62a834e6093c5283d5ad313f594203: Status 404 returned error can't find the container with id 03aa8dc30f302ec3f5b32e20ede0eb7d1b62a834e6093c5283d5ad313f594203 Feb 17 17:02:38 crc kubenswrapper[4694]: W0217 17:02:38.584183 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04936e25_daf1_4d3a_8256_8e1c127688cb.slice/crio-5eacaf56179c005b0d363de7a933576de4bc7b60c8d94a2fdac2e44ea493c0b6 WatchSource:0}: Error finding container 5eacaf56179c005b0d363de7a933576de4bc7b60c8d94a2fdac2e44ea493c0b6: Status 404 returned error can't find the container with id 5eacaf56179c005b0d363de7a933576de4bc7b60c8d94a2fdac2e44ea493c0b6 Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.741347 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w7fks"] Feb 17 17:02:38 crc kubenswrapper[4694]: W0217 17:02:38.757338 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4782973d_718d_4b2a_9b1e_84dfcfbafced.slice/crio-0cf861aad946b4ffcf72a886a6b7a34c4622d2e0d89d135914bda01d6e994d42 WatchSource:0}: Error finding container 0cf861aad946b4ffcf72a886a6b7a34c4622d2e0d89d135914bda01d6e994d42: Status 404 returned error can't find the container with id 0cf861aad946b4ffcf72a886a6b7a34c4622d2e0d89d135914bda01d6e994d42 Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.985943 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ac7-account-create-update-nd26q" event={"ID":"3ebd2d40-ad72-4e4d-9264-890f92641e9d","Type":"ContainerStarted","Data":"61643b7fad49604a99a262c89b7cc43143f1d16a9f3497c3153e275b875a7902"} Feb 17 17:02:38 crc 
kubenswrapper[4694]: I0217 17:02:38.986343 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ac7-account-create-update-nd26q" event={"ID":"3ebd2d40-ad72-4e4d-9264-890f92641e9d","Type":"ContainerStarted","Data":"92d72c339b385660da860ad30a17a8651b5163eb17109d8819123ffa7499e865"} Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.989047 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w7fks" event={"ID":"4782973d-718d-4b2a-9b1e-84dfcfbafced","Type":"ContainerStarted","Data":"0cf861aad946b4ffcf72a886a6b7a34c4622d2e0d89d135914bda01d6e994d42"} Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.992494 4694 generic.go:334] "Generic (PLEG): container finished" podID="2483097f-291b-41d8-9428-6bca4956ae91" containerID="bb6bcb4b5ef7af6e3a457dfc6b56d879f42e873b5acf3a9dced8cf7bad135dc8" exitCode=0 Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.992564 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l2cxb" event={"ID":"2483097f-291b-41d8-9428-6bca4956ae91","Type":"ContainerDied","Data":"bb6bcb4b5ef7af6e3a457dfc6b56d879f42e873b5acf3a9dced8cf7bad135dc8"} Feb 17 17:02:38 crc kubenswrapper[4694]: I0217 17:02:38.992592 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l2cxb" event={"ID":"2483097f-291b-41d8-9428-6bca4956ae91","Type":"ContainerStarted","Data":"1aa6eb427a56916a6d191f381e6930bed3f7269639bbff2b6e47b07c593d6f5c"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.005194 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2194-account-create-update-52gm2" event={"ID":"7e7502f1-dba9-43c8-81b3-8516714bca75","Type":"ContainerStarted","Data":"bae03796d1c540d22f8d7c5755ca9d5eea84ef6055d2beadf7b6512fe0b3fbb6"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.005234 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2194-account-create-update-52gm2" 
event={"ID":"7e7502f1-dba9-43c8-81b3-8516714bca75","Type":"ContainerStarted","Data":"d126631403b4c6720b60a7038a76acf4e25bcf1e1a7476eb23e47df309816753"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.012307 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7ac7-account-create-update-nd26q" podStartSLOduration=3.012263927 podStartE2EDuration="3.012263927s" podCreationTimestamp="2026-02-17 17:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:39.004036024 +0000 UTC m=+1226.761111358" watchObservedRunningTime="2026-02-17 17:02:39.012263927 +0000 UTC m=+1226.769339251" Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.013495 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hgsbq" event={"ID":"b01db982-56b8-4f3c-98f8-c21954640fce","Type":"ContainerStarted","Data":"8f91331c2c2a4ce787273b6090b2a2ad65306d509a6a3576b9799dff4ecab789"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.013542 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hgsbq" event={"ID":"b01db982-56b8-4f3c-98f8-c21954640fce","Type":"ContainerStarted","Data":"03aa8dc30f302ec3f5b32e20ede0eb7d1b62a834e6093c5283d5ad313f594203"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.021736 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0b44-account-create-update-bhwxj" event={"ID":"04936e25-daf1-4d3a-8256-8e1c127688cb","Type":"ContainerStarted","Data":"27a0c7908fab8f0da115dfbefa8e382675b0af1d4083cc91b4ac077713e22a89"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.021790 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0b44-account-create-update-bhwxj" 
event={"ID":"04936e25-daf1-4d3a-8256-8e1c127688cb","Type":"ContainerStarted","Data":"5eacaf56179c005b0d363de7a933576de4bc7b60c8d94a2fdac2e44ea493c0b6"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.026335 4694 generic.go:334] "Generic (PLEG): container finished" podID="a42d3f8d-87a8-4499-a06e-4c0bd452ba66" containerID="af629601be94a0118df68580f2a3373bbec2327d97ae533f2f2b542bd9b3c151" exitCode=0 Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.026393 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bxnn6" event={"ID":"a42d3f8d-87a8-4499-a06e-4c0bd452ba66","Type":"ContainerDied","Data":"af629601be94a0118df68580f2a3373bbec2327d97ae533f2f2b542bd9b3c151"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.048722 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"23cf84806a4b126123342670f8f8884b1c85999e669eb82830ee3404b638e104"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.049004 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"c58a043a9f82d9ba81f9f1666f98012597c26d3563dfa0f7d6e6bebd7e338a32"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.049089 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"bdc873355dd40cd16e9fbf5e324150b918466ca89f020abcc19bc874024dbea7"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.049176 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"c7bc1911e6a9c869bfdf2fcf54daf1c1319cbc4d0770cd68479d41e1210aa228"} Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.143439 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-hgsbq" podStartSLOduration=2.14342224 podStartE2EDuration="2.14342224s" podCreationTimestamp="2026-02-17 17:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:39.132908221 +0000 UTC m=+1226.889983545" watchObservedRunningTime="2026-02-17 17:02:39.14342224 +0000 UTC m=+1226.900497564" Feb 17 17:02:39 crc kubenswrapper[4694]: I0217 17:02:39.179692 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0b44-account-create-update-bhwxj" podStartSLOduration=2.179671334 podStartE2EDuration="2.179671334s" podCreationTimestamp="2026-02-17 17:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:39.147000949 +0000 UTC m=+1226.904076273" watchObservedRunningTime="2026-02-17 17:02:39.179671334 +0000 UTC m=+1226.936746658" Feb 17 17:02:39 crc kubenswrapper[4694]: E0217 17:02:39.211697 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e7502f1_dba9_43c8_81b3_8516714bca75.slice/crio-bae03796d1c540d22f8d7c5755ca9d5eea84ef6055d2beadf7b6512fe0b3fbb6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c3f4af_5b32_4e7f_a562_6fd529a1abaf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e7502f1_dba9_43c8_81b3_8516714bca75.slice/crio-conmon-bae03796d1c540d22f8d7c5755ca9d5eea84ef6055d2beadf7b6512fe0b3fbb6.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.062674 4694 generic.go:334] "Generic 
(PLEG): container finished" podID="04936e25-daf1-4d3a-8256-8e1c127688cb" containerID="27a0c7908fab8f0da115dfbefa8e382675b0af1d4083cc91b4ac077713e22a89" exitCode=0 Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.063134 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0b44-account-create-update-bhwxj" event={"ID":"04936e25-daf1-4d3a-8256-8e1c127688cb","Type":"ContainerDied","Data":"27a0c7908fab8f0da115dfbefa8e382675b0af1d4083cc91b4ac077713e22a89"} Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.065635 4694 generic.go:334] "Generic (PLEG): container finished" podID="3ebd2d40-ad72-4e4d-9264-890f92641e9d" containerID="61643b7fad49604a99a262c89b7cc43143f1d16a9f3497c3153e275b875a7902" exitCode=0 Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.065737 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ac7-account-create-update-nd26q" event={"ID":"3ebd2d40-ad72-4e4d-9264-890f92641e9d","Type":"ContainerDied","Data":"61643b7fad49604a99a262c89b7cc43143f1d16a9f3497c3153e275b875a7902"} Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.066877 4694 generic.go:334] "Generic (PLEG): container finished" podID="7e7502f1-dba9-43c8-81b3-8516714bca75" containerID="bae03796d1c540d22f8d7c5755ca9d5eea84ef6055d2beadf7b6512fe0b3fbb6" exitCode=0 Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.066926 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2194-account-create-update-52gm2" event={"ID":"7e7502f1-dba9-43c8-81b3-8516714bca75","Type":"ContainerDied","Data":"bae03796d1c540d22f8d7c5755ca9d5eea84ef6055d2beadf7b6512fe0b3fbb6"} Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.070717 4694 generic.go:334] "Generic (PLEG): container finished" podID="b01db982-56b8-4f3c-98f8-c21954640fce" containerID="8f91331c2c2a4ce787273b6090b2a2ad65306d509a6a3576b9799dff4ecab789" exitCode=0 Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.071262 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-create-hgsbq" event={"ID":"b01db982-56b8-4f3c-98f8-c21954640fce","Type":"ContainerDied","Data":"8f91331c2c2a4ce787273b6090b2a2ad65306d509a6a3576b9799dff4ecab789"} Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.469320 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.520522 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfv8\" (UniqueName: \"kubernetes.io/projected/7e7502f1-dba9-43c8-81b3-8516714bca75-kube-api-access-vhfv8\") pod \"7e7502f1-dba9-43c8-81b3-8516714bca75\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.520593 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7502f1-dba9-43c8-81b3-8516714bca75-operator-scripts\") pod \"7e7502f1-dba9-43c8-81b3-8516714bca75\" (UID: \"7e7502f1-dba9-43c8-81b3-8516714bca75\") " Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.521798 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7502f1-dba9-43c8-81b3-8516714bca75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e7502f1-dba9-43c8-81b3-8516714bca75" (UID: "7e7502f1-dba9-43c8-81b3-8516714bca75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.524531 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7502f1-dba9-43c8-81b3-8516714bca75-kube-api-access-vhfv8" (OuterVolumeSpecName: "kube-api-access-vhfv8") pod "7e7502f1-dba9-43c8-81b3-8516714bca75" (UID: "7e7502f1-dba9-43c8-81b3-8516714bca75"). InnerVolumeSpecName "kube-api-access-vhfv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.628826 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfv8\" (UniqueName: \"kubernetes.io/projected/7e7502f1-dba9-43c8-81b3-8516714bca75-kube-api-access-vhfv8\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.629111 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e7502f1-dba9-43c8-81b3-8516714bca75-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.654734 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.656029 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.730019 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-operator-scripts\") pod \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.730248 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgvl7\" (UniqueName: \"kubernetes.io/projected/2483097f-291b-41d8-9428-6bca4956ae91-kube-api-access-zgvl7\") pod \"2483097f-291b-41d8-9428-6bca4956ae91\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.730496 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pbl4\" (UniqueName: \"kubernetes.io/projected/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-kube-api-access-8pbl4\") pod 
\"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\" (UID: \"a42d3f8d-87a8-4499-a06e-4c0bd452ba66\") " Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.730517 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a42d3f8d-87a8-4499-a06e-4c0bd452ba66" (UID: "a42d3f8d-87a8-4499-a06e-4c0bd452ba66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.730751 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2483097f-291b-41d8-9428-6bca4956ae91-operator-scripts\") pod \"2483097f-291b-41d8-9428-6bca4956ae91\" (UID: \"2483097f-291b-41d8-9428-6bca4956ae91\") " Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.731263 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2483097f-291b-41d8-9428-6bca4956ae91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2483097f-291b-41d8-9428-6bca4956ae91" (UID: "2483097f-291b-41d8-9428-6bca4956ae91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.731137 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.733845 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-kube-api-access-8pbl4" (OuterVolumeSpecName: "kube-api-access-8pbl4") pod "a42d3f8d-87a8-4499-a06e-4c0bd452ba66" (UID: "a42d3f8d-87a8-4499-a06e-4c0bd452ba66"). 
InnerVolumeSpecName "kube-api-access-8pbl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.734649 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2483097f-291b-41d8-9428-6bca4956ae91-kube-api-access-zgvl7" (OuterVolumeSpecName: "kube-api-access-zgvl7") pod "2483097f-291b-41d8-9428-6bca4956ae91" (UID: "2483097f-291b-41d8-9428-6bca4956ae91"). InnerVolumeSpecName "kube-api-access-zgvl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.833853 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgvl7\" (UniqueName: \"kubernetes.io/projected/2483097f-291b-41d8-9428-6bca4956ae91-kube-api-access-zgvl7\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.833879 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pbl4\" (UniqueName: \"kubernetes.io/projected/a42d3f8d-87a8-4499-a06e-4c0bd452ba66-kube-api-access-8pbl4\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:40 crc kubenswrapper[4694]: I0217 17:02:40.833889 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2483097f-291b-41d8-9428-6bca4956ae91-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.089525 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bxnn6" event={"ID":"a42d3f8d-87a8-4499-a06e-4c0bd452ba66","Type":"ContainerDied","Data":"0cfcf765a64aceaa218d4b03b6ba7059d0af8b57116c2e1030067b6f1a764022"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.089560 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cfcf765a64aceaa218d4b03b6ba7059d0af8b57116c2e1030067b6f1a764022" Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.089571 4694 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bxnn6" Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.096411 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"c062cb4861004abe88a01504d81bf3684b866cc77bb6c276d0b4745dea2962ed"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.096458 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"03652686a946f44a82b9f24d97b905e425b5dd2706cc083d652cdf8445d74204"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.096471 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"8b34d884bff32337a0732a83611b6a06314f27604769fc2a0509f6c67dd609b0"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.096482 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"2076e08a8a12570b66d6e8b2526bcafbf7d66986e5f4feb92fd1b38f077e8b17"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.100887 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l2cxb" event={"ID":"2483097f-291b-41d8-9428-6bca4956ae91","Type":"ContainerDied","Data":"1aa6eb427a56916a6d191f381e6930bed3f7269639bbff2b6e47b07c593d6f5c"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.100930 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aa6eb427a56916a6d191f381e6930bed3f7269639bbff2b6e47b07c593d6f5c" Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.100899 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-l2cxb" Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.102242 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2194-account-create-update-52gm2" event={"ID":"7e7502f1-dba9-43c8-81b3-8516714bca75","Type":"ContainerDied","Data":"d126631403b4c6720b60a7038a76acf4e25bcf1e1a7476eb23e47df309816753"} Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.102290 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d126631403b4c6720b60a7038a76acf4e25bcf1e1a7476eb23e47df309816753" Feb 17 17:02:41 crc kubenswrapper[4694]: I0217 17:02:41.102339 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2194-account-create-update-52gm2" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.607838 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.674506 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8lqnf"] Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.674852 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerName="dnsmasq-dns" containerID="cri-o://9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790" gracePeriod=10 Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.949322 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.956552 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.983457 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.985579 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rddtp\" (UniqueName: \"kubernetes.io/projected/b01db982-56b8-4f3c-98f8-c21954640fce-kube-api-access-rddtp\") pod \"b01db982-56b8-4f3c-98f8-c21954640fce\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.985687 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ebd2d40-ad72-4e4d-9264-890f92641e9d-operator-scripts\") pod \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.985758 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhfzj\" (UniqueName: \"kubernetes.io/projected/3ebd2d40-ad72-4e4d-9264-890f92641e9d-kube-api-access-lhfzj\") pod \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\" (UID: \"3ebd2d40-ad72-4e4d-9264-890f92641e9d\") " Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.985780 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b01db982-56b8-4f3c-98f8-c21954640fce-operator-scripts\") pod \"b01db982-56b8-4f3c-98f8-c21954640fce\" (UID: \"b01db982-56b8-4f3c-98f8-c21954640fce\") " Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.987064 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01db982-56b8-4f3c-98f8-c21954640fce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "b01db982-56b8-4f3c-98f8-c21954640fce" (UID: "b01db982-56b8-4f3c-98f8-c21954640fce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.987986 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ebd2d40-ad72-4e4d-9264-890f92641e9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ebd2d40-ad72-4e4d-9264-890f92641e9d" (UID: "3ebd2d40-ad72-4e4d-9264-890f92641e9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:43 crc kubenswrapper[4694]: I0217 17:02:43.991569 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01db982-56b8-4f3c-98f8-c21954640fce-kube-api-access-rddtp" (OuterVolumeSpecName: "kube-api-access-rddtp") pod "b01db982-56b8-4f3c-98f8-c21954640fce" (UID: "b01db982-56b8-4f3c-98f8-c21954640fce"). InnerVolumeSpecName "kube-api-access-rddtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.008183 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ebd2d40-ad72-4e4d-9264-890f92641e9d-kube-api-access-lhfzj" (OuterVolumeSpecName: "kube-api-access-lhfzj") pod "3ebd2d40-ad72-4e4d-9264-890f92641e9d" (UID: "3ebd2d40-ad72-4e4d-9264-890f92641e9d"). InnerVolumeSpecName "kube-api-access-lhfzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.087458 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04936e25-daf1-4d3a-8256-8e1c127688cb-operator-scripts\") pod \"04936e25-daf1-4d3a-8256-8e1c127688cb\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.087712 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m98bh\" (UniqueName: \"kubernetes.io/projected/04936e25-daf1-4d3a-8256-8e1c127688cb-kube-api-access-m98bh\") pod \"04936e25-daf1-4d3a-8256-8e1c127688cb\" (UID: \"04936e25-daf1-4d3a-8256-8e1c127688cb\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.088327 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ebd2d40-ad72-4e4d-9264-890f92641e9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.088351 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhfzj\" (UniqueName: \"kubernetes.io/projected/3ebd2d40-ad72-4e4d-9264-890f92641e9d-kube-api-access-lhfzj\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.088364 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b01db982-56b8-4f3c-98f8-c21954640fce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.088378 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rddtp\" (UniqueName: \"kubernetes.io/projected/b01db982-56b8-4f3c-98f8-c21954640fce-kube-api-access-rddtp\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.088812 4694 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.089180 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04936e25-daf1-4d3a-8256-8e1c127688cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04936e25-daf1-4d3a-8256-8e1c127688cb" (UID: "04936e25-daf1-4d3a-8256-8e1c127688cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.094460 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04936e25-daf1-4d3a-8256-8e1c127688cb-kube-api-access-m98bh" (OuterVolumeSpecName: "kube-api-access-m98bh") pod "04936e25-daf1-4d3a-8256-8e1c127688cb" (UID: "04936e25-daf1-4d3a-8256-8e1c127688cb"). InnerVolumeSpecName "kube-api-access-m98bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.157424 4694 generic.go:334] "Generic (PLEG): container finished" podID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerID="9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790" exitCode=0 Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.157530 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" event={"ID":"2cd132a1-e4c4-4588-a436-daa27b4a1a98","Type":"ContainerDied","Data":"9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790"} Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.157563 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" event={"ID":"2cd132a1-e4c4-4588-a436-daa27b4a1a98","Type":"ContainerDied","Data":"4ac4702334ddd7f77bc6f5340f0a8f296d465bdee469dca069110bc2a8cee71b"} Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.157582 4694 scope.go:117] "RemoveContainer" 
containerID="9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.157753 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8lqnf" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.177183 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hgsbq" event={"ID":"b01db982-56b8-4f3c-98f8-c21954640fce","Type":"ContainerDied","Data":"03aa8dc30f302ec3f5b32e20ede0eb7d1b62a834e6093c5283d5ad313f594203"} Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.177246 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03aa8dc30f302ec3f5b32e20ede0eb7d1b62a834e6093c5283d5ad313f594203" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.177329 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hgsbq" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190196 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-sb\") pod \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190260 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-nb\") pod \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190311 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-config\") pod \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\" (UID: 
\"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190368 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmdtj\" (UniqueName: \"kubernetes.io/projected/2cd132a1-e4c4-4588-a436-daa27b4a1a98-kube-api-access-hmdtj\") pod \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190444 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-dns-svc\") pod \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\" (UID: \"2cd132a1-e4c4-4588-a436-daa27b4a1a98\") " Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190920 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04936e25-daf1-4d3a-8256-8e1c127688cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.190945 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m98bh\" (UniqueName: \"kubernetes.io/projected/04936e25-daf1-4d3a-8256-8e1c127688cb-kube-api-access-m98bh\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.194898 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0b44-account-create-update-bhwxj" event={"ID":"04936e25-daf1-4d3a-8256-8e1c127688cb","Type":"ContainerDied","Data":"5eacaf56179c005b0d363de7a933576de4bc7b60c8d94a2fdac2e44ea493c0b6"} Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.194943 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eacaf56179c005b0d363de7a933576de4bc7b60c8d94a2fdac2e44ea493c0b6" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.195027 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0b44-account-create-update-bhwxj" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.214265 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7ac7-account-create-update-nd26q" event={"ID":"3ebd2d40-ad72-4e4d-9264-890f92641e9d","Type":"ContainerDied","Data":"92d72c339b385660da860ad30a17a8651b5163eb17109d8819123ffa7499e865"} Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.214333 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d72c339b385660da860ad30a17a8651b5163eb17109d8819123ffa7499e865" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.214423 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7ac7-account-create-update-nd26q" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.220210 4694 scope.go:117] "RemoveContainer" containerID="54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.220595 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd132a1-e4c4-4588-a436-daa27b4a1a98-kube-api-access-hmdtj" (OuterVolumeSpecName: "kube-api-access-hmdtj") pod "2cd132a1-e4c4-4588-a436-daa27b4a1a98" (UID: "2cd132a1-e4c4-4588-a436-daa27b4a1a98"). InnerVolumeSpecName "kube-api-access-hmdtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.237390 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w7fks" event={"ID":"4782973d-718d-4b2a-9b1e-84dfcfbafced","Type":"ContainerStarted","Data":"c5fcd645551e1c3cdb5d945297b25cba32fce983fcf514f2bd71f5eaff0f4d21"} Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.276088 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-w7fks" podStartSLOduration=2.249747412 podStartE2EDuration="7.276052415s" podCreationTimestamp="2026-02-17 17:02:37 +0000 UTC" firstStartedPulling="2026-02-17 17:02:38.762341995 +0000 UTC m=+1226.519417319" lastFinishedPulling="2026-02-17 17:02:43.788646998 +0000 UTC m=+1231.545722322" observedRunningTime="2026-02-17 17:02:44.258718167 +0000 UTC m=+1232.015793491" watchObservedRunningTime="2026-02-17 17:02:44.276052415 +0000 UTC m=+1232.033127739" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.292086 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmdtj\" (UniqueName: \"kubernetes.io/projected/2cd132a1-e4c4-4588-a436-daa27b4a1a98-kube-api-access-hmdtj\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.298965 4694 scope.go:117] "RemoveContainer" containerID="9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790" Feb 17 17:02:44 crc kubenswrapper[4694]: E0217 17:02:44.302372 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790\": container with ID starting with 9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790 not found: ID does not exist" containerID="9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.302425 4694 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790"} err="failed to get container status \"9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790\": rpc error: code = NotFound desc = could not find container \"9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790\": container with ID starting with 9a4456c28850bcf752928b49b98bc899a5f88f32b3f4c6b6f76226eca1c95790 not found: ID does not exist" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.302448 4694 scope.go:117] "RemoveContainer" containerID="54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc" Feb 17 17:02:44 crc kubenswrapper[4694]: E0217 17:02:44.303373 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc\": container with ID starting with 54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc not found: ID does not exist" containerID="54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.303426 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc"} err="failed to get container status \"54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc\": rpc error: code = NotFound desc = could not find container \"54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc\": container with ID starting with 54b94a076489ee5d1dbebb5dd65659edb08804cad9af1a859a3783a032f8c9dc not found: ID does not exist" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.311307 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-config" (OuterVolumeSpecName: 
"config") pod "2cd132a1-e4c4-4588-a436-daa27b4a1a98" (UID: "2cd132a1-e4c4-4588-a436-daa27b4a1a98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.315180 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2cd132a1-e4c4-4588-a436-daa27b4a1a98" (UID: "2cd132a1-e4c4-4588-a436-daa27b4a1a98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.324787 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2cd132a1-e4c4-4588-a436-daa27b4a1a98" (UID: "2cd132a1-e4c4-4588-a436-daa27b4a1a98"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.329964 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2cd132a1-e4c4-4588-a436-daa27b4a1a98" (UID: "2cd132a1-e4c4-4588-a436-daa27b4a1a98"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.393994 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.394027 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.394038 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.394046 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cd132a1-e4c4-4588-a436-daa27b4a1a98-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.499667 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8lqnf"] Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.506858 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8lqnf"] Feb 17 17:02:44 crc kubenswrapper[4694]: I0217 17:02:44.905572 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" path="/var/lib/kubelet/pods/2cd132a1-e4c4-4588-a436-daa27b4a1a98/volumes" Feb 17 17:02:45 crc kubenswrapper[4694]: I0217 17:02:45.249444 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"fcf54650fa2f1d6e329f404bb062a1e6bf1ea8a0ccb9f96ab97985b7e6daf983"} Feb 17 17:02:45 crc 
kubenswrapper[4694]: I0217 17:02:45.249819 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"f427b68c07cf19ccd22048872d960d921772b376df0845127ebe9ef90b1432a2"} Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.264309 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"5c47f3103c130ecfda2e2fd65f8faec4560028077d0ac35422850a4a6cbdd3b3"} Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.266861 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"53ad265734e7ff99b1762329cde24e056a0c3d80b5dcd7fb032f5dbfbe112ea2"} Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.266883 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"7566f7e998461da4297b13eff684bd3e541b111b6e4f90276fd44f58ba530a44"} Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.266895 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"c01a88af0edd6d387a2f80ed95bc1b900b67f7a3efbf2eda9b0c057097187fae"} Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.266909 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"852493c7-97b4-4850-9ef3-44ec598d9d1a","Type":"ContainerStarted","Data":"648504cdc1beeeef6c86b285aa4318532b74fc0a7905748593a4dc969b3dd1a9"} Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.309872 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.467830054 
podStartE2EDuration="44.309847194s" podCreationTimestamp="2026-02-17 17:02:02 +0000 UTC" firstStartedPulling="2026-02-17 17:02:35.86981927 +0000 UTC m=+1223.626894594" lastFinishedPulling="2026-02-17 17:02:44.71183641 +0000 UTC m=+1232.468911734" observedRunningTime="2026-02-17 17:02:46.300288618 +0000 UTC m=+1234.057363942" watchObservedRunningTime="2026-02-17 17:02:46.309847194 +0000 UTC m=+1234.066922528" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.574746 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-89x4r"] Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575290 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerName="init" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575321 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerName="init" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575345 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01db982-56b8-4f3c-98f8-c21954640fce" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575360 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01db982-56b8-4f3c-98f8-c21954640fce" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575398 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42d3f8d-87a8-4499-a06e-4c0bd452ba66" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575413 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42d3f8d-87a8-4499-a06e-4c0bd452ba66" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575452 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04936e25-daf1-4d3a-8256-8e1c127688cb" containerName="mariadb-account-create-update" Feb 17 
17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575465 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="04936e25-daf1-4d3a-8256-8e1c127688cb" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575484 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerName="dnsmasq-dns" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575497 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerName="dnsmasq-dns" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575518 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebd2d40-ad72-4e4d-9264-890f92641e9d" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575532 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebd2d40-ad72-4e4d-9264-890f92641e9d" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575557 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2483097f-291b-41d8-9428-6bca4956ae91" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575570 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2483097f-291b-41d8-9428-6bca4956ae91" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: E0217 17:02:46.575599 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7502f1-dba9-43c8-81b3-8516714bca75" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575649 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7502f1-dba9-43c8-81b3-8516714bca75" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575936 4694 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3ebd2d40-ad72-4e4d-9264-890f92641e9d" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575965 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd132a1-e4c4-4588-a436-daa27b4a1a98" containerName="dnsmasq-dns" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.575985 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="04936e25-daf1-4d3a-8256-8e1c127688cb" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.576001 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="2483097f-291b-41d8-9428-6bca4956ae91" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.576020 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01db982-56b8-4f3c-98f8-c21954640fce" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.576041 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7502f1-dba9-43c8-81b3-8516714bca75" containerName="mariadb-account-create-update" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.576072 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42d3f8d-87a8-4499-a06e-4c0bd452ba66" containerName="mariadb-database-create" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.577887 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.580411 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.584785 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-89x4r"] Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.743571 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.743743 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.743776 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-svc\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.744668 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64t5d\" (UniqueName: \"kubernetes.io/projected/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-kube-api-access-64t5d\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " 
pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.744748 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-config\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.744788 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846146 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846201 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-svc\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846292 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64t5d\" (UniqueName: \"kubernetes.io/projected/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-kube-api-access-64t5d\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 
17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846323 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-config\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846373 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846417 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.846944 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.847039 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-svc\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.847238 4694 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.847254 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-config\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.847474 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.863704 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64t5d\" (UniqueName: \"kubernetes.io/projected/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-kube-api-access-64t5d\") pod \"dnsmasq-dns-895cf5cf-89x4r\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:46 crc kubenswrapper[4694]: I0217 17:02:46.905173 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:47 crc kubenswrapper[4694]: I0217 17:02:47.283092 4694 generic.go:334] "Generic (PLEG): container finished" podID="4782973d-718d-4b2a-9b1e-84dfcfbafced" containerID="c5fcd645551e1c3cdb5d945297b25cba32fce983fcf514f2bd71f5eaff0f4d21" exitCode=0 Feb 17 17:02:47 crc kubenswrapper[4694]: I0217 17:02:47.283165 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w7fks" event={"ID":"4782973d-718d-4b2a-9b1e-84dfcfbafced","Type":"ContainerDied","Data":"c5fcd645551e1c3cdb5d945297b25cba32fce983fcf514f2bd71f5eaff0f4d21"} Feb 17 17:02:47 crc kubenswrapper[4694]: I0217 17:02:47.350021 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-89x4r"] Feb 17 17:02:47 crc kubenswrapper[4694]: W0217 17:02:47.353014 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f29c44a_e48e_405a_b7ce_06ad6c46d70c.slice/crio-90cc9895407da703c5896d6477a6e66c5136beb278935b935861486f2cbc2b38 WatchSource:0}: Error finding container 90cc9895407da703c5896d6477a6e66c5136beb278935b935861486f2cbc2b38: Status 404 returned error can't find the container with id 90cc9895407da703c5896d6477a6e66c5136beb278935b935861486f2cbc2b38 Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.291601 4694 generic.go:334] "Generic (PLEG): container finished" podID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerID="27ebfa4682757e22c479d4d3db486c068a098c8b0817e649b0ceed1ebd7d9738" exitCode=0 Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.291721 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" event={"ID":"4f29c44a-e48e-405a-b7ce-06ad6c46d70c","Type":"ContainerDied","Data":"27ebfa4682757e22c479d4d3db486c068a098c8b0817e649b0ceed1ebd7d9738"} Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.292019 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" event={"ID":"4f29c44a-e48e-405a-b7ce-06ad6c46d70c","Type":"ContainerStarted","Data":"90cc9895407da703c5896d6477a6e66c5136beb278935b935861486f2cbc2b38"} Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.567446 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.678727 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-config-data\") pod \"4782973d-718d-4b2a-9b1e-84dfcfbafced\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.678825 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-combined-ca-bundle\") pod \"4782973d-718d-4b2a-9b1e-84dfcfbafced\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.679014 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57ldp\" (UniqueName: \"kubernetes.io/projected/4782973d-718d-4b2a-9b1e-84dfcfbafced-kube-api-access-57ldp\") pod \"4782973d-718d-4b2a-9b1e-84dfcfbafced\" (UID: \"4782973d-718d-4b2a-9b1e-84dfcfbafced\") " Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.684970 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4782973d-718d-4b2a-9b1e-84dfcfbafced-kube-api-access-57ldp" (OuterVolumeSpecName: "kube-api-access-57ldp") pod "4782973d-718d-4b2a-9b1e-84dfcfbafced" (UID: "4782973d-718d-4b2a-9b1e-84dfcfbafced"). InnerVolumeSpecName "kube-api-access-57ldp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.702259 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4782973d-718d-4b2a-9b1e-84dfcfbafced" (UID: "4782973d-718d-4b2a-9b1e-84dfcfbafced"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.745275 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-config-data" (OuterVolumeSpecName: "config-data") pod "4782973d-718d-4b2a-9b1e-84dfcfbafced" (UID: "4782973d-718d-4b2a-9b1e-84dfcfbafced"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.780973 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.781003 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4782973d-718d-4b2a-9b1e-84dfcfbafced-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:48 crc kubenswrapper[4694]: I0217 17:02:48.781014 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57ldp\" (UniqueName: \"kubernetes.io/projected/4782973d-718d-4b2a-9b1e-84dfcfbafced-kube-api-access-57ldp\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.302845 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" 
event={"ID":"4f29c44a-e48e-405a-b7ce-06ad6c46d70c","Type":"ContainerStarted","Data":"c86772b977985e1a43c6ffe7609e58ddf8a11dd0a9c5611cd33d6ff6c09629b6"} Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.303011 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.307892 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w7fks" event={"ID":"4782973d-718d-4b2a-9b1e-84dfcfbafced","Type":"ContainerDied","Data":"0cf861aad946b4ffcf72a886a6b7a34c4622d2e0d89d135914bda01d6e994d42"} Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.307936 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cf861aad946b4ffcf72a886a6b7a34c4622d2e0d89d135914bda01d6e994d42" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.307971 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w7fks" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.343114 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" podStartSLOduration=3.343077234 podStartE2EDuration="3.343077234s" podCreationTimestamp="2026-02-17 17:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:49.331626763 +0000 UTC m=+1237.088702087" watchObservedRunningTime="2026-02-17 17:02:49.343077234 +0000 UTC m=+1237.100152558" Feb 17 17:02:49 crc kubenswrapper[4694]: E0217 17:02:49.419436 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c3f4af_5b32_4e7f_a562_6fd529a1abaf.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 
17:02:49.630660 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n2mgs"] Feb 17 17:02:49 crc kubenswrapper[4694]: E0217 17:02:49.631325 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4782973d-718d-4b2a-9b1e-84dfcfbafced" containerName="keystone-db-sync" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.631348 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4782973d-718d-4b2a-9b1e-84dfcfbafced" containerName="keystone-db-sync" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.631545 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4782973d-718d-4b2a-9b1e-84dfcfbafced" containerName="keystone-db-sync" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.632190 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.634125 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.634441 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z26nq" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.634730 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.636425 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.638345 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.645704 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n2mgs"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.662414 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-895cf5cf-89x4r"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.720977 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-w9jm9"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.722238 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.747977 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-w9jm9"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.814689 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dfc8b5bfc-j4v8j"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.817854 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.824346 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.824716 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.824848 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.837690 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dfc8b5bfc-j4v8j"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.841568 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pqd\" (UniqueName: \"kubernetes.io/projected/35746db3-691f-41a1-8421-b9f11fd1d766-kube-api-access-45pqd\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 
17:02:49.841711 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.841796 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.841886 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.841914 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-fernet-keys\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.841938 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-config\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: 
I0217 17:02:49.841953 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwjss\" (UniqueName: \"kubernetes.io/projected/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-kube-api-access-nwjss\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.841971 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.842028 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-combined-ca-bundle\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.842047 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-credential-keys\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.842103 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-config-data\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc 
kubenswrapper[4694]: I0217 17:02:49.842121 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-scripts\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.850121 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-xv9gp" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.927567 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-x8w9s"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.928834 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.932516 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bmt67" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.932860 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.934378 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945069 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945134 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945190 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wjfv\" (UniqueName: \"kubernetes.io/projected/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-kube-api-access-6wjfv\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945216 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945238 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-fernet-keys\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945252 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-logs\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945270 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-config\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: 
\"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945285 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwjss\" (UniqueName: \"kubernetes.io/projected/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-kube-api-access-nwjss\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945299 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945329 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-credential-keys\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945344 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-combined-ca-bundle\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945368 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-config-data\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " 
pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945395 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-scripts\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945411 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-config-data\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945427 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-horizon-secret-key\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945446 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-scripts\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.945463 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pqd\" (UniqueName: \"kubernetes.io/projected/35746db3-691f-41a1-8421-b9f11fd1d766-kube-api-access-45pqd\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc 
kubenswrapper[4694]: I0217 17:02:49.946734 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.948230 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.948797 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.950241 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-config\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.951009 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.953065 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-x8w9s"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.975997 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dgmfl"] Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.976991 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.979008 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-config-data\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.984181 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-scripts\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.985167 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-credential-keys\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.988065 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.988217 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 17:02:49 crc kubenswrapper[4694]: I0217 17:02:49.988346 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ngrvm" Feb 17 17:02:50 crc 
kubenswrapper[4694]: I0217 17:02:49.999981 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-combined-ca-bundle\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.000355 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-fernet-keys\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.005497 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dgmfl"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.008001 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pqd\" (UniqueName: \"kubernetes.io/projected/35746db3-691f-41a1-8421-b9f11fd1d766-kube-api-access-45pqd\") pod \"dnsmasq-dns-6c9c9f998c-w9jm9\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.029579 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwjss\" (UniqueName: \"kubernetes.io/projected/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-kube-api-access-nwjss\") pod \"keystone-bootstrap-n2mgs\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046706 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-logs\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " 
pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046756 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-combined-ca-bundle\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046795 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-db-sync-config-data\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046823 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-config-data\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046853 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-horizon-secret-key\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046869 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-scripts\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: 
I0217 17:02:50.046921 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8vqv\" (UniqueName: \"kubernetes.io/projected/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-kube-api-access-z8vqv\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046950 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-etc-machine-id\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.046968 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-config-data\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.047005 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wjfv\" (UniqueName: \"kubernetes.io/projected/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-kube-api-access-6wjfv\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.047025 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-scripts\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.047388 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-logs\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.048355 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-config-data\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.049255 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-scripts\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.060666 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-w9jm9"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.061439 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.063226 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-horizon-secret-key\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.093332 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wjfv\" (UniqueName: \"kubernetes.io/projected/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-kube-api-access-6wjfv\") pod \"horizon-dfc8b5bfc-j4v8j\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.109518 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jlvk2"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.111720 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.124500 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jlvk2"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.194379 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.195571 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-swggc"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.197554 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.208072 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.208315 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.216816 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wmtwl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.233820 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8vqv\" (UniqueName: \"kubernetes.io/projected/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-kube-api-access-z8vqv\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.233919 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-etc-machine-id\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.233962 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-config-data\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.233996 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-config\") pod \"neutron-db-sync-dgmfl\" (UID: 
\"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.234034 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gsf\" (UniqueName: \"kubernetes.io/projected/15ec1579-807c-4af0-8332-9a52733beed0-kube-api-access-b2gsf\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.235715 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-etc-machine-id\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.235945 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.242313 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.246887 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.251173 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.251221 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.251777 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rzpn\" (UniqueName: \"kubernetes.io/projected/06daa413-9d80-4e00-a276-d84c2e15a56f-kube-api-access-6rzpn\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.251829 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-scripts\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252256 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-combined-ca-bundle\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252296 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252323 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252378 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252413 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-db-sync-config-data\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252460 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-combined-ca-bundle\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252545 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-config\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.252570 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.273968 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-scripts\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.275249 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-combined-ca-bundle\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.277807 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-swggc"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.301094 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-db-sync-config-data\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.302253 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8vqv\" (UniqueName: \"kubernetes.io/projected/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-kube-api-access-z8vqv\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.302647 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-config-data\") pod \"cinder-db-sync-x8w9s\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " 
pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.309945 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.341902 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-c88gm"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.361220 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.363291 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-scripts\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364043 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8994bf6c-4617-4837-a8a2-4d399f187abb-logs\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364076 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-combined-ca-bundle\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364136 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-config\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " 
pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364650 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-scripts\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364707 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gsf\" (UniqueName: \"kubernetes.io/projected/15ec1579-807c-4af0-8332-9a52733beed0-kube-api-access-b2gsf\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364763 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ftg9\" (UniqueName: \"kubernetes.io/projected/8994bf6c-4617-4837-a8a2-4d399f187abb-kube-api-access-5ftg9\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364795 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rzpn\" (UniqueName: \"kubernetes.io/projected/06daa413-9d80-4e00-a276-d84c2e15a56f-kube-api-access-6rzpn\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364830 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc 
kubenswrapper[4694]: I0217 17:02:50.364870 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-config-data\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364894 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364916 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364942 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.364987 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-config-data\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365009 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365065 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-combined-ca-bundle\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365097 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7tc\" (UniqueName: \"kubernetes.io/projected/e23514ea-6a1f-433d-ab93-663bd65629d2-kube-api-access-9z7tc\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365193 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-log-httpd\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365219 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-config\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365239 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.365279 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-run-httpd\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.373450 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-config\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.374989 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.375071 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.376384 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k6rt9" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.381734 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.392438 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-config\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.404823 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.407997 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.413980 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-combined-ca-bundle\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.419026 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.425913 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.427330 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rzpn\" (UniqueName: \"kubernetes.io/projected/06daa413-9d80-4e00-a276-d84c2e15a56f-kube-api-access-6rzpn\") pod \"dnsmasq-dns-57c957c4ff-jlvk2\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") " pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.428615 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.441001 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fp2f" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.441205 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.441238 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gsf\" (UniqueName: \"kubernetes.io/projected/15ec1579-807c-4af0-8332-9a52733beed0-kube-api-access-b2gsf\") pod \"neutron-db-sync-dgmfl\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.480571 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8994bf6c-4617-4837-a8a2-4d399f187abb-logs\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.480898 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-combined-ca-bundle\") 
pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.480928 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-scripts\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.480961 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ftg9\" (UniqueName: \"kubernetes.io/projected/8994bf6c-4617-4837-a8a2-4d399f187abb-kube-api-access-5ftg9\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.480988 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vwg\" (UniqueName: \"kubernetes.io/projected/aaac0bee-f5f9-49c0-b880-6c57d412972e-kube-api-access-n2vwg\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481020 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481053 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-config-data\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc 
kubenswrapper[4694]: I0217 17:02:50.481075 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-combined-ca-bundle\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481095 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481138 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-config-data\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481190 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7tc\" (UniqueName: \"kubernetes.io/projected/e23514ea-6a1f-433d-ab93-663bd65629d2-kube-api-access-9z7tc\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481237 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-db-sync-config-data\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481264 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-log-httpd\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481297 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-run-httpd\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.481335 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-scripts\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.487601 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8994bf6c-4617-4837-a8a2-4d399f187abb-logs\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.490067 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.491161 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-log-httpd\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.491410 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-run-httpd\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.497532 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-combined-ca-bundle\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.505815 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-config-data\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.506956 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.507799 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.508390 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-config-data\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc 
kubenswrapper[4694]: I0217 17:02:50.512002 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-scripts\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.512592 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-scripts\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.516858 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-c88gm"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.527128 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.530454 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ftg9\" (UniqueName: \"kubernetes.io/projected/8994bf6c-4617-4837-a8a2-4d399f187abb-kube-api-access-5ftg9\") pod \"placement-db-sync-swggc\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.545568 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7tc\" (UniqueName: \"kubernetes.io/projected/e23514ea-6a1f-433d-ab93-663bd65629d2-kube-api-access-9z7tc\") pod \"ceilometer-0\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.563669 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-894447f7f-w5grh"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.564623 4694 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.565072 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.570656 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.590793 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.591103 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4wr\" (UniqueName: \"kubernetes.io/projected/f38a1468-87fe-4c44-92a6-101d1c64c1ef-kube-api-access-hd4wr\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.591233 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-db-sync-config-data\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.591389 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.592116 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.592364 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-logs\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.592730 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.592922 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vwg\" (UniqueName: \"kubernetes.io/projected/aaac0bee-f5f9-49c0-b880-6c57d412972e-kube-api-access-n2vwg\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.593082 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-combined-ca-bundle\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " 
pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.595976 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.596282 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.597452 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.628382 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-combined-ca-bundle\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.643101 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vwg\" (UniqueName: \"kubernetes.io/projected/aaac0bee-f5f9-49c0-b880-6c57d412972e-kube-api-access-n2vwg\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.647366 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-db-sync-config-data\") pod \"barbican-db-sync-c88gm\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") " pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.694202 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-swggc" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701505 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-config-data\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701543 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701589 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701624 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701655 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4wr\" (UniqueName: \"kubernetes.io/projected/f38a1468-87fe-4c44-92a6-101d1c64c1ef-kube-api-access-hd4wr\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701674 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674q2\" (UniqueName: \"kubernetes.io/projected/83185784-bd77-41d3-a0da-28fa4fabf086-kube-api-access-674q2\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701704 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83185784-bd77-41d3-a0da-28fa4fabf086-logs\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701730 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.701751 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.704696 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-scripts\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.704843 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-logs\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.704892 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.706099 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/83185784-bd77-41d3-a0da-28fa4fabf086-horizon-secret-key\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.713358 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.718591 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-config-data\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.722036 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-scripts\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.724833 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.726375 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.726601 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-logs\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.728260 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.732171 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.745573 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4wr\" (UniqueName: \"kubernetes.io/projected/f38a1468-87fe-4c44-92a6-101d1c64c1ef-kube-api-access-hd4wr\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.748129 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-894447f7f-w5grh"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.781167 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-c88gm" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.799334 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.808689 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/83185784-bd77-41d3-a0da-28fa4fabf086-horizon-secret-key\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.808730 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-config-data\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.808792 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674q2\" (UniqueName: \"kubernetes.io/projected/83185784-bd77-41d3-a0da-28fa4fabf086-kube-api-access-674q2\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.808819 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83185784-bd77-41d3-a0da-28fa4fabf086-logs\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.808849 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-scripts\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.809590 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-scripts\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.812492 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-config-data\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.812721 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83185784-bd77-41d3-a0da-28fa4fabf086-logs\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.821421 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/83185784-bd77-41d3-a0da-28fa4fabf086-horizon-secret-key\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.825846 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.844676 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674q2\" (UniqueName: \"kubernetes.io/projected/83185784-bd77-41d3-a0da-28fa4fabf086-kube-api-access-674q2\") pod \"horizon-894447f7f-w5grh\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.867517 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-w9jm9"] Feb 17 17:02:50 crc kubenswrapper[4694]: I0217 17:02:50.920031 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.008108 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.009281 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.009358 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.019143 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.026826 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.129805 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130159 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf4kg\" (UniqueName: \"kubernetes.io/projected/684e88c8-a0bb-4593-b312-3b482b2b22e5-kube-api-access-gf4kg\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130275 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130303 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130333 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130359 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-logs\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130437 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.130485 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.231860 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.231916 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.231947 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.231970 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-logs\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.232024 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.232061 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 
17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.232120 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.232160 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf4kg\" (UniqueName: \"kubernetes.io/projected/684e88c8-a0bb-4593-b312-3b482b2b22e5-kube-api-access-gf4kg\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.232970 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.233569 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-logs\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.241445 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 
17:02:51.242252 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.247222 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.247407 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.247589 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.273303 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf4kg\" (UniqueName: \"kubernetes.io/projected/684e88c8-a0bb-4593-b312-3b482b2b22e5-kube-api-access-gf4kg\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.316478 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.361862 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.362444 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" event={"ID":"35746db3-691f-41a1-8421-b9f11fd1d766","Type":"ContainerStarted","Data":"04297654cebf302542bfb270575a825bde39d0133928d1585fb0e1a9794feedb"} Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.362701 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerName="dnsmasq-dns" containerID="cri-o://c86772b977985e1a43c6ffe7609e58ddf8a11dd0a9c5611cd33d6ff6c09629b6" gracePeriod=10 Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.433409 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dfc8b5bfc-j4v8j"] Feb 17 17:02:51 crc kubenswrapper[4694]: W0217 17:02:51.635904 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3ff074b_45bc_4b82_89cf_b42f4b5991e1.slice/crio-d05406835dcf7146affc6efa8ef6832352c63432823b06e6f2ad5075d99d3254 WatchSource:0}: Error finding container d05406835dcf7146affc6efa8ef6832352c63432823b06e6f2ad5075d99d3254: Status 404 returned error can't find the container with id d05406835dcf7146affc6efa8ef6832352c63432823b06e6f2ad5075d99d3254 Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.649538 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x8w9s"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.674436 
4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-swggc"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.688365 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dgmfl"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.695676 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:02:51 crc kubenswrapper[4694]: W0217 17:02:51.738266 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode23514ea_6a1f_433d_ab93_663bd65629d2.slice/crio-4edb5aada38db9867c0022bfd4e86c7714541273160468e4ae13cc3ca67b880f WatchSource:0}: Error finding container 4edb5aada38db9867c0022bfd4e86c7714541273160468e4ae13cc3ca67b880f: Status 404 returned error can't find the container with id 4edb5aada38db9867c0022bfd4e86c7714541273160468e4ae13cc3ca67b880f Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.850366 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jlvk2"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.857998 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-c88gm"] Feb 17 17:02:51 crc kubenswrapper[4694]: I0217 17:02:51.889900 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n2mgs"] Feb 17 17:02:52 crc kubenswrapper[4694]: I0217 17:02:52.040318 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-894447f7f-w5grh"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.122374 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dfc8b5bfc-j4v8j"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.138339 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.195409 4694 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/horizon-d99d85789-zsrm7"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.202503 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.232970 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.250527 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.268774 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d99d85789-zsrm7"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.296967 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.367673 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-horizon-secret-key\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.367738 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgp2\" (UniqueName: \"kubernetes.io/projected/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-kube-api-access-8qgp2\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.367775 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-config-data\") pod 
\"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.367813 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-scripts\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.367861 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-logs\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.372995 4694 generic.go:334] "Generic (PLEG): container finished" podID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerID="c86772b977985e1a43c6ffe7609e58ddf8a11dd0a9c5611cd33d6ff6c09629b6" exitCode=0 Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.373053 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" event={"ID":"4f29c44a-e48e-405a-b7ce-06ad6c46d70c","Type":"ContainerDied","Data":"c86772b977985e1a43c6ffe7609e58ddf8a11dd0a9c5611cd33d6ff6c09629b6"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.374123 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerStarted","Data":"4edb5aada38db9867c0022bfd4e86c7714541273160468e4ae13cc3ca67b880f"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.375029 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfc8b5bfc-j4v8j" 
event={"ID":"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a","Type":"ContainerStarted","Data":"8027f8872a538a0b6bcdb9d1c09d19ae2061ffb0f19fa32e282c3a66b06c5845"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.377586 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-swggc" event={"ID":"8994bf6c-4617-4837-a8a2-4d399f187abb","Type":"ContainerStarted","Data":"b72887e6687200e38dcb9bbc1ac1c5b7b1ffbd584e0a4f3c904dcb4cd054f061"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.378841 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dgmfl" event={"ID":"15ec1579-807c-4af0-8332-9a52733beed0","Type":"ContainerStarted","Data":"7427e03f2d84fa3e66b045872908e7c6a5944e358029d5da83d818349c8cd312"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.379671 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x8w9s" event={"ID":"a3ff074b-45bc-4b82-89cf-b42f4b5991e1","Type":"ContainerStarted","Data":"d05406835dcf7146affc6efa8ef6832352c63432823b06e6f2ad5075d99d3254"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.469329 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-horizon-secret-key\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.469431 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgp2\" (UniqueName: \"kubernetes.io/projected/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-kube-api-access-8qgp2\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.469485 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-config-data\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.469534 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-scripts\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.469601 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-logs\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.470022 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-logs\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.470746 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-scripts\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.471096 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-config-data\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " 
pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.476312 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-horizon-secret-key\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.498259 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgp2\" (UniqueName: \"kubernetes.io/projected/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-kube-api-access-8qgp2\") pod \"horizon-d99d85789-zsrm7\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:52.557234 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:53.228294 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:54.720533 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06daa413_9d80_4e00_a276_d84c2e15a56f.slice/crio-f9b92ee1b351ec300e8a7962102dda9f1b78f18689a677253479011157123725 WatchSource:0}: Error finding container f9b92ee1b351ec300e8a7962102dda9f1b78f18689a677253479011157123725: Status 404 returned error can't find the container with id f9b92ee1b351ec300e8a7962102dda9f1b78f18689a677253479011157123725 Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:54.722990 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaac0bee_f5f9_49c0_b880_6c57d412972e.slice/crio-13ba9b3f5b6c455e80d496ee3349257da5ffc245fcb98723693507d32a4c064e 
WatchSource:0}: Error finding container 13ba9b3f5b6c455e80d496ee3349257da5ffc245fcb98723693507d32a4c064e: Status 404 returned error can't find the container with id 13ba9b3f5b6c455e80d496ee3349257da5ffc245fcb98723693507d32a4c064e Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:54.723515 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d06bc9_36b6_46b3_83d9_49335aa9c01e.slice/crio-a4c1799f23f77bc3a68f3a482e70656b13cc4013936b8901ddb39c6801abc07c WatchSource:0}: Error finding container a4c1799f23f77bc3a68f3a482e70656b13cc4013936b8901ddb39c6801abc07c: Status 404 returned error can't find the container with id a4c1799f23f77bc3a68f3a482e70656b13cc4013936b8901ddb39c6801abc07c Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:54.731697 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83185784_bd77_41d3_a0da_28fa4fabf086.slice/crio-36ba35d4275d4f4d602a1872bd288f96cb024f5e55cbff69db00bf71b4a1f0eb WatchSource:0}: Error finding container 36ba35d4275d4f4d602a1872bd288f96cb024f5e55cbff69db00bf71b4a1f0eb: Status 404 returned error can't find the container with id 36ba35d4275d4f4d602a1872bd288f96cb024f5e55cbff69db00bf71b4a1f0eb Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:54.732107 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf38a1468_87fe_4c44_92a6_101d1c64c1ef.slice/crio-310fecd670c156c5b6776d939f23c14dd28f6df51b4385af029e4364a1920813 WatchSource:0}: Error finding container 310fecd670c156c5b6776d939f23c14dd28f6df51b4385af029e4364a1920813: Status 404 returned error can't find the container with id 310fecd670c156c5b6776d939f23c14dd28f6df51b4385af029e4364a1920813 Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:54.734307 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod684e88c8_a0bb_4593_b312_3b482b2b22e5.slice/crio-46177b14afa0ccd50cb43bfce90625c7e1e245d61ea8468320450686ed9bcef0 WatchSource:0}: Error finding container 46177b14afa0ccd50cb43bfce90625c7e1e245d61ea8468320450686ed9bcef0: Status 404 returned error can't find the container with id 46177b14afa0ccd50cb43bfce90625c7e1e245d61ea8468320450686ed9bcef0 Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.402174 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c88gm" event={"ID":"aaac0bee-f5f9-49c0-b880-6c57d412972e","Type":"ContainerStarted","Data":"13ba9b3f5b6c455e80d496ee3349257da5ffc245fcb98723693507d32a4c064e"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.403729 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-894447f7f-w5grh" event={"ID":"83185784-bd77-41d3-a0da-28fa4fabf086","Type":"ContainerStarted","Data":"36ba35d4275d4f4d602a1872bd288f96cb024f5e55cbff69db00bf71b4a1f0eb"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.405097 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" event={"ID":"35746db3-691f-41a1-8421-b9f11fd1d766","Type":"ContainerStarted","Data":"0694cb02322fb3a9e09862f803482ccb1b37592b3eb7ac7854984e73d2e974ec"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.406323 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f38a1468-87fe-4c44-92a6-101d1c64c1ef","Type":"ContainerStarted","Data":"310fecd670c156c5b6776d939f23c14dd28f6df51b4385af029e4364a1920813"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.407261 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2mgs" event={"ID":"a0d06bc9-36b6-46b3-83d9-49335aa9c01e","Type":"ContainerStarted","Data":"a4c1799f23f77bc3a68f3a482e70656b13cc4013936b8901ddb39c6801abc07c"} Feb 17 17:02:55 crc 
kubenswrapper[4694]: I0217 17:02:55.408507 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"684e88c8-a0bb-4593-b312-3b482b2b22e5","Type":"ContainerStarted","Data":"46177b14afa0ccd50cb43bfce90625c7e1e245d61ea8468320450686ed9bcef0"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.409558 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" event={"ID":"06daa413-9d80-4e00-a276-d84c2e15a56f","Type":"ContainerStarted","Data":"f9b92ee1b351ec300e8a7962102dda9f1b78f18689a677253479011157123725"} Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.835227 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d99d85789-zsrm7"] Feb 17 17:02:55 crc kubenswrapper[4694]: W0217 17:02:55.860490 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e4f3c9b_0a3f_47e5_80d0_ba2a452ff260.slice/crio-a05c1a983adf9eb5dae115c0763be198847408565dcd18eebc362f90f3d4849f WatchSource:0}: Error finding container a05c1a983adf9eb5dae115c0763be198847408565dcd18eebc362f90f3d4849f: Status 404 returned error can't find the container with id a05c1a983adf9eb5dae115c0763be198847408565dcd18eebc362f90f3d4849f Feb 17 17:02:55 crc kubenswrapper[4694]: I0217 17:02:55.961995 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.036120 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-nb\") pod \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.036192 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-swift-storage-0\") pod \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.036267 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-svc\") pod \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.036370 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-sb\") pod \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.036409 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64t5d\" (UniqueName: \"kubernetes.io/projected/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-kube-api-access-64t5d\") pod \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.036436 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-config\") pod \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\" (UID: \"4f29c44a-e48e-405a-b7ce-06ad6c46d70c\") " Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.061974 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-kube-api-access-64t5d" (OuterVolumeSpecName: "kube-api-access-64t5d") pod "4f29c44a-e48e-405a-b7ce-06ad6c46d70c" (UID: "4f29c44a-e48e-405a-b7ce-06ad6c46d70c"). InnerVolumeSpecName "kube-api-access-64t5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.115565 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f29c44a-e48e-405a-b7ce-06ad6c46d70c" (UID: "4f29c44a-e48e-405a-b7ce-06ad6c46d70c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.117479 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-config" (OuterVolumeSpecName: "config") pod "4f29c44a-e48e-405a-b7ce-06ad6c46d70c" (UID: "4f29c44a-e48e-405a-b7ce-06ad6c46d70c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.125563 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f29c44a-e48e-405a-b7ce-06ad6c46d70c" (UID: "4f29c44a-e48e-405a-b7ce-06ad6c46d70c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.126174 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f29c44a-e48e-405a-b7ce-06ad6c46d70c" (UID: "4f29c44a-e48e-405a-b7ce-06ad6c46d70c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.127987 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f29c44a-e48e-405a-b7ce-06ad6c46d70c" (UID: "4f29c44a-e48e-405a-b7ce-06ad6c46d70c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.141118 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.141149 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64t5d\" (UniqueName: \"kubernetes.io/projected/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-kube-api-access-64t5d\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.141160 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.141170 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 
17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.141180 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.141187 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f29c44a-e48e-405a-b7ce-06ad6c46d70c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.427692 4694 generic.go:334] "Generic (PLEG): container finished" podID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerID="1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd" exitCode=0 Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.427763 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" event={"ID":"06daa413-9d80-4e00-a276-d84c2e15a56f","Type":"ContainerDied","Data":"1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.432026 4694 generic.go:334] "Generic (PLEG): container finished" podID="35746db3-691f-41a1-8421-b9f11fd1d766" containerID="0694cb02322fb3a9e09862f803482ccb1b37592b3eb7ac7854984e73d2e974ec" exitCode=0 Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.432134 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" event={"ID":"35746db3-691f-41a1-8421-b9f11fd1d766","Type":"ContainerDied","Data":"0694cb02322fb3a9e09862f803482ccb1b37592b3eb7ac7854984e73d2e974ec"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.435216 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dgmfl" event={"ID":"15ec1579-807c-4af0-8332-9a52733beed0","Type":"ContainerStarted","Data":"39724c483ff231898372944df1f10912b1ad27b5bcae744ea3bdc174bada0df2"} Feb 17 17:02:56 crc 
kubenswrapper[4694]: I0217 17:02:56.438482 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f38a1468-87fe-4c44-92a6-101d1c64c1ef","Type":"ContainerStarted","Data":"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.451705 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2mgs" event={"ID":"a0d06bc9-36b6-46b3-83d9-49335aa9c01e","Type":"ContainerStarted","Data":"d4f43c456a260da5382abe58881b26520dcdec039fdd492e8b142758c6c5c7ab"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.455740 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"684e88c8-a0bb-4593-b312-3b482b2b22e5","Type":"ContainerStarted","Data":"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.462689 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d99d85789-zsrm7" event={"ID":"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260","Type":"ContainerStarted","Data":"a05c1a983adf9eb5dae115c0763be198847408565dcd18eebc362f90f3d4849f"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.465704 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" event={"ID":"4f29c44a-e48e-405a-b7ce-06ad6c46d70c","Type":"ContainerDied","Data":"90cc9895407da703c5896d6477a6e66c5136beb278935b935861486f2cbc2b38"} Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.465754 4694 scope.go:117] "RemoveContainer" containerID="c86772b977985e1a43c6ffe7609e58ddf8a11dd0a9c5611cd33d6ff6c09629b6" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.465780 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-89x4r" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.495482 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dgmfl" podStartSLOduration=7.495458458 podStartE2EDuration="7.495458458s" podCreationTimestamp="2026-02-17 17:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:56.462896017 +0000 UTC m=+1244.219971351" watchObservedRunningTime="2026-02-17 17:02:56.495458458 +0000 UTC m=+1244.252533802" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.588560 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n2mgs" podStartSLOduration=7.588535109 podStartE2EDuration="7.588535109s" podCreationTimestamp="2026-02-17 17:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:56.533745581 +0000 UTC m=+1244.290820895" watchObservedRunningTime="2026-02-17 17:02:56.588535109 +0000 UTC m=+1244.345610433" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.630660 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-89x4r"] Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.641594 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-89x4r"] Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.668403 4694 scope.go:117] "RemoveContainer" containerID="27ebfa4682757e22c479d4d3db486c068a098c8b0817e649b0ceed1ebd7d9738" Feb 17 17:02:56 crc kubenswrapper[4694]: I0217 17:02:56.942322 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" path="/var/lib/kubelet/pods/4f29c44a-e48e-405a-b7ce-06ad6c46d70c/volumes" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 
17:02:57.027516 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.163529 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pqd\" (UniqueName: \"kubernetes.io/projected/35746db3-691f-41a1-8421-b9f11fd1d766-kube-api-access-45pqd\") pod \"35746db3-691f-41a1-8421-b9f11fd1d766\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.163651 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-sb\") pod \"35746db3-691f-41a1-8421-b9f11fd1d766\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.163699 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-nb\") pod \"35746db3-691f-41a1-8421-b9f11fd1d766\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.163737 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-swift-storage-0\") pod \"35746db3-691f-41a1-8421-b9f11fd1d766\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.163761 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-svc\") pod \"35746db3-691f-41a1-8421-b9f11fd1d766\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.163847 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-config\") pod \"35746db3-691f-41a1-8421-b9f11fd1d766\" (UID: \"35746db3-691f-41a1-8421-b9f11fd1d766\") " Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.186393 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-config" (OuterVolumeSpecName: "config") pod "35746db3-691f-41a1-8421-b9f11fd1d766" (UID: "35746db3-691f-41a1-8421-b9f11fd1d766"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.188783 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35746db3-691f-41a1-8421-b9f11fd1d766-kube-api-access-45pqd" (OuterVolumeSpecName: "kube-api-access-45pqd") pod "35746db3-691f-41a1-8421-b9f11fd1d766" (UID: "35746db3-691f-41a1-8421-b9f11fd1d766"). InnerVolumeSpecName "kube-api-access-45pqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.196503 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35746db3-691f-41a1-8421-b9f11fd1d766" (UID: "35746db3-691f-41a1-8421-b9f11fd1d766"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.197791 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35746db3-691f-41a1-8421-b9f11fd1d766" (UID: "35746db3-691f-41a1-8421-b9f11fd1d766"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.198088 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35746db3-691f-41a1-8421-b9f11fd1d766" (UID: "35746db3-691f-41a1-8421-b9f11fd1d766"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.205427 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "35746db3-691f-41a1-8421-b9f11fd1d766" (UID: "35746db3-691f-41a1-8421-b9f11fd1d766"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.265731 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.265763 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pqd\" (UniqueName: \"kubernetes.io/projected/35746db3-691f-41a1-8421-b9f11fd1d766-kube-api-access-45pqd\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.265774 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.265789 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.265798 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.265809 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35746db3-691f-41a1-8421-b9f11fd1d766-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.475931 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" event={"ID":"35746db3-691f-41a1-8421-b9f11fd1d766","Type":"ContainerDied","Data":"04297654cebf302542bfb270575a825bde39d0133928d1585fb0e1a9794feedb"} Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.475969 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-w9jm9" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.475993 4694 scope.go:117] "RemoveContainer" containerID="0694cb02322fb3a9e09862f803482ccb1b37592b3eb7ac7854984e73d2e974ec" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.480090 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f38a1468-87fe-4c44-92a6-101d1c64c1ef","Type":"ContainerStarted","Data":"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b"} Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.480162 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-log" containerID="cri-o://a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d" gracePeriod=30 Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.480256 4694 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-httpd" containerID="cri-o://f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b" gracePeriod=30 Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.483094 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"684e88c8-a0bb-4593-b312-3b482b2b22e5","Type":"ContainerStarted","Data":"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d"} Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.483217 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-log" containerID="cri-o://cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea" gracePeriod=30 Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.483226 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-httpd" containerID="cri-o://18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d" gracePeriod=30 Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.493165 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" event={"ID":"06daa413-9d80-4e00-a276-d84c2e15a56f","Type":"ContainerStarted","Data":"d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2"} Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.494012 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.510382 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=7.510364053 podStartE2EDuration="7.510364053s" podCreationTimestamp="2026-02-17 17:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:57.508032576 +0000 UTC m=+1245.265107900" watchObservedRunningTime="2026-02-17 17:02:57.510364053 +0000 UTC m=+1245.267439377" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.538975 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" podStartSLOduration=7.538957727 podStartE2EDuration="7.538957727s" podCreationTimestamp="2026-02-17 17:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:57.527255839 +0000 UTC m=+1245.284331173" watchObservedRunningTime="2026-02-17 17:02:57.538957727 +0000 UTC m=+1245.296033051" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.560221 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.5602031 podStartE2EDuration="8.5602031s" podCreationTimestamp="2026-02-17 17:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:02:57.55698414 +0000 UTC m=+1245.314059464" watchObservedRunningTime="2026-02-17 17:02:57.5602031 +0000 UTC m=+1245.317278424" Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.646526 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-w9jm9"] Feb 17 17:02:57 crc kubenswrapper[4694]: I0217 17:02:57.653757 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-w9jm9"] Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.318732 4694 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397160 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397223 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-public-tls-certs\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397309 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4wr\" (UniqueName: \"kubernetes.io/projected/f38a1468-87fe-4c44-92a6-101d1c64c1ef-kube-api-access-hd4wr\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397359 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-httpd-run\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397387 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-scripts\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397449 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-combined-ca-bundle\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397557 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-config-data\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.397653 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-logs\") pod \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\" (UID: \"f38a1468-87fe-4c44-92a6-101d1c64c1ef\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.398279 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.411152 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-logs" (OuterVolumeSpecName: "logs") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.413599 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-scripts" (OuterVolumeSpecName: "scripts") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.420804 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.443897 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38a1468-87fe-4c44-92a6-101d1c64c1ef-kube-api-access-hd4wr" (OuterVolumeSpecName: "kube-api-access-hd4wr") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "kube-api-access-hd4wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.499457 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.499503 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.499514 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4wr\" (UniqueName: \"kubernetes.io/projected/f38a1468-87fe-4c44-92a6-101d1c64c1ef-kube-api-access-hd4wr\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.499523 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f38a1468-87fe-4c44-92a6-101d1c64c1ef-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.499532 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.502080 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.502872 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-config-data" (OuterVolumeSpecName: "config-data") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.512484 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532021 4694 generic.go:334] "Generic (PLEG): container finished" podID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerID="f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b" exitCode=0 Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532052 4694 generic.go:334] "Generic (PLEG): container finished" podID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerID="a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d" exitCode=143 Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532096 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f38a1468-87fe-4c44-92a6-101d1c64c1ef","Type":"ContainerDied","Data":"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b"} Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532127 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f38a1468-87fe-4c44-92a6-101d1c64c1ef","Type":"ContainerDied","Data":"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d"} Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532137 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f38a1468-87fe-4c44-92a6-101d1c64c1ef","Type":"ContainerDied","Data":"310fecd670c156c5b6776d939f23c14dd28f6df51b4385af029e4364a1920813"} Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 
17:02:58.532152 4694 scope.go:117] "RemoveContainer" containerID="f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532196 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.532245 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.535248 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f38a1468-87fe-4c44-92a6-101d1c64c1ef" (UID: "f38a1468-87fe-4c44-92a6-101d1c64c1ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.537062 4694 generic.go:334] "Generic (PLEG): container finished" podID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerID="18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d" exitCode=0 Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.537125 4694 generic.go:334] "Generic (PLEG): container finished" podID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerID="cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea" exitCode=143 Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.538239 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.538316 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"684e88c8-a0bb-4593-b312-3b482b2b22e5","Type":"ContainerDied","Data":"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d"} Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.538360 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"684e88c8-a0bb-4593-b312-3b482b2b22e5","Type":"ContainerDied","Data":"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea"} Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.538371 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"684e88c8-a0bb-4593-b312-3b482b2b22e5","Type":"ContainerDied","Data":"46177b14afa0ccd50cb43bfce90625c7e1e245d61ea8468320450686ed9bcef0"} Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.582386 4694 scope.go:117] "RemoveContainer" containerID="a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.602215 4694 scope.go:117] "RemoveContainer" containerID="f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b" Feb 17 17:02:58 crc kubenswrapper[4694]: E0217 17:02:58.602941 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b\": container with ID starting with f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b not found: ID does not exist" containerID="f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.602991 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b"} err="failed to get container status \"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b\": rpc error: code = NotFound desc = could not find container \"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b\": container with ID starting with f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.603023 4694 scope.go:117] "RemoveContainer" containerID="a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d" Feb 17 17:02:58 crc kubenswrapper[4694]: E0217 17:02:58.603413 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d\": container with ID starting with a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d not found: ID does not exist" containerID="a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.603448 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d"} err="failed to get container status \"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d\": rpc error: code = NotFound desc = could not find container \"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d\": container with ID starting with a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.603467 4694 scope.go:117] "RemoveContainer" containerID="f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604119 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-httpd-run\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604169 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-combined-ca-bundle\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604200 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-config-data\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604338 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf4kg\" (UniqueName: \"kubernetes.io/projected/684e88c8-a0bb-4593-b312-3b482b2b22e5-kube-api-access-gf4kg\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604466 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604547 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-scripts\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc 
kubenswrapper[4694]: I0217 17:02:58.604584 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-internal-tls-certs\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604627 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-logs\") pod \"684e88c8-a0bb-4593-b312-3b482b2b22e5\" (UID: \"684e88c8-a0bb-4593-b312-3b482b2b22e5\") " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.604878 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.605312 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.605337 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.605349 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.605360 4694 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.605373 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38a1468-87fe-4c44-92a6-101d1c64c1ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.605685 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-logs" (OuterVolumeSpecName: "logs") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.608900 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684e88c8-a0bb-4593-b312-3b482b2b22e5-kube-api-access-gf4kg" (OuterVolumeSpecName: "kube-api-access-gf4kg") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "kube-api-access-gf4kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.609136 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.610098 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-scripts" (OuterVolumeSpecName: "scripts") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.619214 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b"} err="failed to get container status \"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b\": rpc error: code = NotFound desc = could not find container \"f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b\": container with ID starting with f4f5cdbca0c4fbbfcce1e1636f5b0b69864c8768a52d1cac084eb25c5003497b not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.620137 4694 scope.go:117] "RemoveContainer" containerID="a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.622124 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d"} err="failed to get container status \"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d\": rpc error: code = NotFound desc = could not find container \"a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d\": container with ID starting with a406fb941ed580c757be600ee3b550d3ddd1d5536394ed70d638ff52515e108d not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.622349 4694 scope.go:117] "RemoveContainer" containerID="18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.640975 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.665938 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.666476 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-config-data" (OuterVolumeSpecName: "config-data") pod "684e88c8-a0bb-4593-b312-3b482b2b22e5" (UID: "684e88c8-a0bb-4593-b312-3b482b2b22e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.693617 4694 scope.go:117] "RemoveContainer" containerID="cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707721 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707755 4694 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707767 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684e88c8-a0bb-4593-b312-3b482b2b22e5-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707777 4694 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707788 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684e88c8-a0bb-4593-b312-3b482b2b22e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707796 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf4kg\" (UniqueName: \"kubernetes.io/projected/684e88c8-a0bb-4593-b312-3b482b2b22e5-kube-api-access-gf4kg\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.707835 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.717483 4694 scope.go:117] "RemoveContainer" containerID="18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d" Feb 17 17:02:58 crc kubenswrapper[4694]: E0217 17:02:58.718003 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d\": container with ID starting with 18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d not found: ID does not exist" containerID="18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.718045 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d"} err="failed to get container status \"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d\": rpc error: code = NotFound desc = could not find container 
\"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d\": container with ID starting with 18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.718072 4694 scope.go:117] "RemoveContainer" containerID="cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea" Feb 17 17:02:58 crc kubenswrapper[4694]: E0217 17:02:58.718338 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea\": container with ID starting with cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea not found: ID does not exist" containerID="cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.718401 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea"} err="failed to get container status \"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea\": rpc error: code = NotFound desc = could not find container \"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea\": container with ID starting with cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.718432 4694 scope.go:117] "RemoveContainer" containerID="18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.718857 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d"} err="failed to get container status \"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d\": rpc error: code = NotFound desc = could not find 
container \"18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d\": container with ID starting with 18bdb136194166d95c82500dfb9171fd28807277685d5bb995ec45b320efac1d not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.719003 4694 scope.go:117] "RemoveContainer" containerID="cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.720019 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea"} err="failed to get container status \"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea\": rpc error: code = NotFound desc = could not find container \"cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea\": container with ID starting with cd39f0f32a75ce431809d959a8274ff58fa3dd7bffcbf158bb4867c5c748b9ea not found: ID does not exist" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.728975 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.809751 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.938174 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35746db3-691f-41a1-8421-b9f11fd1d766" path="/var/lib/kubelet/pods/35746db3-691f-41a1-8421-b9f11fd1d766/volumes" Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.938840 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.938874 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.965907 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:58 crc kubenswrapper[4694]: I0217 17:02:58.977401 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-894447f7f-w5grh"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.005643 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.026890 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.027342 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-log" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.027360 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-log" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.027690 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerName="init" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.027703 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerName="init" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.027718 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35746db3-691f-41a1-8421-b9f11fd1d766" containerName="init" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.027726 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="35746db3-691f-41a1-8421-b9f11fd1d766" containerName="init" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.027735 4694 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-httpd" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.027769 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-httpd" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.027786 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerName="dnsmasq-dns" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.027793 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerName="dnsmasq-dns" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.027814 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-log" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030155 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-log" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.030214 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-httpd" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030225 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-httpd" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030560 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f29c44a-e48e-405a-b7ce-06ad6c46d70c" containerName="dnsmasq-dns" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030580 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-log" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030593 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" 
containerName="glance-log" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030623 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" containerName="glance-httpd" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030641 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="35746db3-691f-41a1-8421-b9f11fd1d766" containerName="init" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.030650 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" containerName="glance-httpd" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.033689 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.038520 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fp2f" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.038636 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.038731 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.038778 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.038795 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.051620 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.053106 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.056310 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.056530 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.068920 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.084191 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-757dbcd46d-pw2kl"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.085937 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.088147 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.109774 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.110601 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-j4pnz logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="ddd53f01-1931-49a0-9d70-dc8e98399e80" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120724 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120795 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120818 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120836 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120867 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120909 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120946 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120966 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.120993 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121026 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-config-data\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121209 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121259 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pnz\" (UniqueName: \"kubernetes.io/projected/ddd53f01-1931-49a0-9d70-dc8e98399e80-kube-api-access-j4pnz\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121348 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121420 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-logs\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121458 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrrfn\" (UniqueName: \"kubernetes.io/projected/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-kube-api-access-rrrfn\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.121568 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.122756 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-757dbcd46d-pw2kl"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.140214 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.140845 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-rrrfn logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="d5ec3d57-1dd9-4182-9e58-1136c551ad6e" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.209662 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d99d85789-zsrm7"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226643 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226692 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226714 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226731 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226767 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-secret-key\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226794 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226810 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-tls-certs\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226830 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226851 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226869 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-combined-ca-bundle\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226884 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d49lb\" (UniqueName: \"kubernetes.io/projected/5bc102be-9643-4310-900a-c6f6803a395a-kube-api-access-d49lb\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226904 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226919 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-config-data\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226937 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226954 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pnz\" (UniqueName: \"kubernetes.io/projected/ddd53f01-1931-49a0-9d70-dc8e98399e80-kube-api-access-j4pnz\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.226981 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227007 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc102be-9643-4310-900a-c6f6803a395a-logs\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227024 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-logs\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227043 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrrfn\" (UniqueName: \"kubernetes.io/projected/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-kube-api-access-rrrfn\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227062 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-scripts\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227080 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-config-data\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227119 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.227141 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-scripts\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.228077 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.229910 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.230075 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.230214 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-logs\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.230319 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.231480 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b8f4f9856-rcwl9"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.233244 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.233668 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.234402 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-config-data\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.234625 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.235706 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.236244 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-scripts\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.237120 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.242413 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.244806 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.252367 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.253100 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrrfn\" (UniqueName: \"kubernetes.io/projected/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-kube-api-access-rrrfn\") pod \"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.254314 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b8f4f9856-rcwl9"] Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.256083 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pnz\" (UniqueName: \"kubernetes.io/projected/ddd53f01-1931-49a0-9d70-dc8e98399e80-kube-api-access-j4pnz\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.285571 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.307883 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.328954 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17711b82-3f49-41da-b17d-785c70869492-scripts\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329015 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-horizon-secret-key\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329048 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4kn\" (UniqueName: \"kubernetes.io/projected/17711b82-3f49-41da-b17d-785c70869492-kube-api-access-tq4kn\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329088 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-combined-ca-bundle\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329123 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-secret-key\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329146 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-tls-certs\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329175 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-horizon-tls-certs\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329198 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-combined-ca-bundle\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329221 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d49lb\" (UniqueName: \"kubernetes.io/projected/5bc102be-9643-4310-900a-c6f6803a395a-kube-api-access-d49lb\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329280 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/17711b82-3f49-41da-b17d-785c70869492-logs\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329301 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc102be-9643-4310-900a-c6f6803a395a-logs\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329326 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-scripts\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329347 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-config-data\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.329363 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17711b82-3f49-41da-b17d-785c70869492-config-data\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.330492 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc102be-9643-4310-900a-c6f6803a395a-logs\") pod \"horizon-757dbcd46d-pw2kl\" (UID: 
\"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.350983 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-secret-key\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.351241 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-scripts\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.352165 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-tls-certs\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.360790 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-config-data\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.361492 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-combined-ca-bundle\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 
17:02:59.365217 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d49lb\" (UniqueName: \"kubernetes.io/projected/5bc102be-9643-4310-900a-c6f6803a395a-kube-api-access-d49lb\") pod \"horizon-757dbcd46d-pw2kl\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.420859 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.432813 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4kn\" (UniqueName: \"kubernetes.io/projected/17711b82-3f49-41da-b17d-785c70869492-kube-api-access-tq4kn\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.432862 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-combined-ca-bundle\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.432955 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-horizon-tls-certs\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.433082 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17711b82-3f49-41da-b17d-785c70869492-logs\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: 
\"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.433142 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17711b82-3f49-41da-b17d-785c70869492-config-data\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.433230 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17711b82-3f49-41da-b17d-785c70869492-scripts\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.433941 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17711b82-3f49-41da-b17d-785c70869492-logs\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.434964 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17711b82-3f49-41da-b17d-785c70869492-config-data\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.435127 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-horizon-secret-key\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.435853 
4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17711b82-3f49-41da-b17d-785c70869492-scripts\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.438086 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-horizon-tls-certs\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.438584 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-horizon-secret-key\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.454448 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq4kn\" (UniqueName: \"kubernetes.io/projected/17711b82-3f49-41da-b17d-785c70869492-kube-api-access-tq4kn\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.459732 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17711b82-3f49-41da-b17d-785c70869492-combined-ca-bundle\") pod \"horizon-7b8f4f9856-rcwl9\" (UID: \"17711b82-3f49-41da-b17d-785c70869492\") " pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.470651 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.565507 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.566220 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.578541 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.588665 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642095 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-config-data\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642150 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4pnz\" (UniqueName: \"kubernetes.io/projected/ddd53f01-1931-49a0-9d70-dc8e98399e80-kube-api-access-j4pnz\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642243 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-scripts\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642260 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642284 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-internal-tls-certs\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642299 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrrfn\" (UniqueName: \"kubernetes.io/projected/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-kube-api-access-rrrfn\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642397 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-httpd-run\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642442 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-scripts\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642466 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-logs\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642498 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642542 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-public-tls-certs\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642562 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-combined-ca-bundle\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642581 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-config-data\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642601 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-httpd-run\") pod \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\" (UID: \"d5ec3d57-1dd9-4182-9e58-1136c551ad6e\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642674 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-combined-ca-bundle\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: 
\"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.642690 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-logs\") pod \"ddd53f01-1931-49a0-9d70-dc8e98399e80\" (UID: \"ddd53f01-1931-49a0-9d70-dc8e98399e80\") " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.647700 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-logs" (OuterVolumeSpecName: "logs") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.650855 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.653502 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.653659 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-config-data" (OuterVolumeSpecName: "config-data") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.653672 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-kube-api-access-rrrfn" (OuterVolumeSpecName: "kube-api-access-rrrfn") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "kube-api-access-rrrfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.653930 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-logs" (OuterVolumeSpecName: "logs") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.653973 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.654144 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.655056 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-scripts" (OuterVolumeSpecName: "scripts") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.655291 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd53f01-1931-49a0-9d70-dc8e98399e80-kube-api-access-j4pnz" (OuterVolumeSpecName: "kube-api-access-j4pnz") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "kube-api-access-j4pnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.655583 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.656177 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.658460 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-config-data" (OuterVolumeSpecName: "config-data") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.658877 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.672641 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d5ec3d57-1dd9-4182-9e58-1136c551ad6e" (UID: "d5ec3d57-1dd9-4182-9e58-1136c551ad6e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.675214 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-scripts" (OuterVolumeSpecName: "scripts") pod "ddd53f01-1931-49a0-9d70-dc8e98399e80" (UID: "ddd53f01-1931-49a0-9d70-dc8e98399e80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:02:59 crc kubenswrapper[4694]: E0217 17:02:59.730855 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c3f4af_5b32_4e7f_a562_6fd529a1abaf.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744839 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744884 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744896 4694 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744908 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744918 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744927 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 
17:02:59.744935 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744943 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744951 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744959 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4pnz\" (UniqueName: \"kubernetes.io/projected/ddd53f01-1931-49a0-9d70-dc8e98399e80-kube-api-access-j4pnz\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744969 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744986 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.744995 4694 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.745003 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrrfn\" (UniqueName: 
\"kubernetes.io/projected/d5ec3d57-1dd9-4182-9e58-1136c551ad6e-kube-api-access-rrrfn\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.745012 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd53f01-1931-49a0-9d70-dc8e98399e80-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.745019 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd53f01-1931-49a0-9d70-dc8e98399e80-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.761742 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.763132 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.845991 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:02:59 crc kubenswrapper[4694]: I0217 17:02:59.846029 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.073189 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b8f4f9856-rcwl9"] Feb 17 17:03:00 crc kubenswrapper[4694]: W0217 17:03:00.097501 4694 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17711b82_3f49_41da_b17d_785c70869492.slice/crio-f13392b226d6a3d26aa53496863ea93389fae0115153b1c4ec59b5edc943cb4e WatchSource:0}: Error finding container f13392b226d6a3d26aa53496863ea93389fae0115153b1c4ec59b5edc943cb4e: Status 404 returned error can't find the container with id f13392b226d6a3d26aa53496863ea93389fae0115153b1c4ec59b5edc943cb4e Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.212681 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-757dbcd46d-pw2kl"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.579036 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f4f9856-rcwl9" event={"ID":"17711b82-3f49-41da-b17d-785c70869492","Type":"ContainerStarted","Data":"f13392b226d6a3d26aa53496863ea93389fae0115153b1c4ec59b5edc943cb4e"} Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.581254 4694 generic.go:334] "Generic (PLEG): container finished" podID="a0d06bc9-36b6-46b3-83d9-49335aa9c01e" containerID="d4f43c456a260da5382abe58881b26520dcdec039fdd492e8b142758c6c5c7ab" exitCode=0 Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.581333 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.581747 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2mgs" event={"ID":"a0d06bc9-36b6-46b3-83d9-49335aa9c01e","Type":"ContainerDied","Data":"d4f43c456a260da5382abe58881b26520dcdec039fdd492e8b142758c6c5c7ab"} Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.582126 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.652838 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.681594 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.695047 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.696877 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.702813 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.702902 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.703162 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6fp2f" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.710731 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.720389 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772444 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772524 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772555 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d59j\" (UniqueName: \"kubernetes.io/projected/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-kube-api-access-9d59j\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772634 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772675 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772718 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772745 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.772774 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.773094 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.782744 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.852556 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.854389 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.856860 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.857062 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.866052 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876568 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-logs\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876625 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876649 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876723 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876761 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876785 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d59j\" (UniqueName: \"kubernetes.io/projected/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-kube-api-access-9d59j\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876847 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.876879 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.883078 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.883316 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.883530 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-logs\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.899363 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.903244 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.909286 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 
17:03:00.913270 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.914834 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684e88c8-a0bb-4593-b312-3b482b2b22e5" path="/var/lib/kubelet/pods/684e88c8-a0bb-4593-b312-3b482b2b22e5/volumes" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.915905 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ec3d57-1dd9-4182-9e58-1136c551ad6e" path="/var/lib/kubelet/pods/d5ec3d57-1dd9-4182-9e58-1136c551ad6e/volumes" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.916487 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd53f01-1931-49a0-9d70-dc8e98399e80" path="/var/lib/kubelet/pods/ddd53f01-1931-49a0-9d70-dc8e98399e80/volumes" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.917522 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f38a1468-87fe-4c44-92a6-101d1c64c1ef" path="/var/lib/kubelet/pods/f38a1468-87fe-4c44-92a6-101d1c64c1ef/volumes" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981223 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981289 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-logs\") 
pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981333 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981392 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981413 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-config-data\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981467 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981505 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-scripts\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.981532 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsjlh\" (UniqueName: \"kubernetes.io/projected/15056d4d-99d7-4c45-bd24-8141aeca9791-kube-api-access-jsjlh\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.983793 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:00 crc kubenswrapper[4694]: I0217 17:03:00.985543 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d59j\" (UniqueName: \"kubernetes.io/projected/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-kube-api-access-9d59j\") pod \"glance-default-internal-api-0\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.022633 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083264 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-logs\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083339 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083447 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083477 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-config-data\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083528 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc 
kubenswrapper[4694]: I0217 17:03:01.083586 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-scripts\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083632 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsjlh\" (UniqueName: \"kubernetes.io/projected/15056d4d-99d7-4c45-bd24-8141aeca9791-kube-api-access-jsjlh\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083685 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.083720 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.084900 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-logs\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.085092 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.089291 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.089988 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-scripts\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.091474 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-config-data\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.092285 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.112197 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsjlh\" (UniqueName: 
\"kubernetes.io/projected/15056d4d-99d7-4c45-bd24-8141aeca9791-kube-api-access-jsjlh\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.120882 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " pod="openstack/glance-default-external-api-0" Feb 17 17:03:01 crc kubenswrapper[4694]: I0217 17:03:01.183021 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:03:02 crc kubenswrapper[4694]: W0217 17:03:02.850954 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc102be_9643_4310_900a_c6f6803a395a.slice/crio-9716fa20f0d3ceeb6f2f08251d8443dd46d5ce0d9e00178fca9ee3d863432d1d WatchSource:0}: Error finding container 9716fa20f0d3ceeb6f2f08251d8443dd46d5ce0d9e00178fca9ee3d863432d1d: Status 404 returned error can't find the container with id 9716fa20f0d3ceeb6f2f08251d8443dd46d5ce0d9e00178fca9ee3d863432d1d Feb 17 17:03:02 crc kubenswrapper[4694]: I0217 17:03:02.985305 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.041284 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-combined-ca-bundle\") pod \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.041375 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-fernet-keys\") pod \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.041426 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-scripts\") pod \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.041470 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-credential-keys\") pod \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.041520 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-config-data\") pod \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.041601 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwjss\" (UniqueName: 
\"kubernetes.io/projected/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-kube-api-access-nwjss\") pod \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\" (UID: \"a0d06bc9-36b6-46b3-83d9-49335aa9c01e\") " Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.047807 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a0d06bc9-36b6-46b3-83d9-49335aa9c01e" (UID: "a0d06bc9-36b6-46b3-83d9-49335aa9c01e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.050894 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-scripts" (OuterVolumeSpecName: "scripts") pod "a0d06bc9-36b6-46b3-83d9-49335aa9c01e" (UID: "a0d06bc9-36b6-46b3-83d9-49335aa9c01e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.054834 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a0d06bc9-36b6-46b3-83d9-49335aa9c01e" (UID: "a0d06bc9-36b6-46b3-83d9-49335aa9c01e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.058681 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-kube-api-access-nwjss" (OuterVolumeSpecName: "kube-api-access-nwjss") pod "a0d06bc9-36b6-46b3-83d9-49335aa9c01e" (UID: "a0d06bc9-36b6-46b3-83d9-49335aa9c01e"). InnerVolumeSpecName "kube-api-access-nwjss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.076995 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-config-data" (OuterVolumeSpecName: "config-data") pod "a0d06bc9-36b6-46b3-83d9-49335aa9c01e" (UID: "a0d06bc9-36b6-46b3-83d9-49335aa9c01e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.081601 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0d06bc9-36b6-46b3-83d9-49335aa9c01e" (UID: "a0d06bc9-36b6-46b3-83d9-49335aa9c01e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.143556 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.143591 4694 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.143599 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.143624 4694 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:03 
crc kubenswrapper[4694]: I0217 17:03:03.143632 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.143641 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwjss\" (UniqueName: \"kubernetes.io/projected/a0d06bc9-36b6-46b3-83d9-49335aa9c01e-kube-api-access-nwjss\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.609910 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757dbcd46d-pw2kl" event={"ID":"5bc102be-9643-4310-900a-c6f6803a395a","Type":"ContainerStarted","Data":"9716fa20f0d3ceeb6f2f08251d8443dd46d5ce0d9e00178fca9ee3d863432d1d"} Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.623005 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n2mgs" event={"ID":"a0d06bc9-36b6-46b3-83d9-49335aa9c01e","Type":"ContainerDied","Data":"a4c1799f23f77bc3a68f3a482e70656b13cc4013936b8901ddb39c6801abc07c"} Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.623058 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n2mgs" Feb 17 17:03:03 crc kubenswrapper[4694]: I0217 17:03:03.623059 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c1799f23f77bc3a68f3a482e70656b13cc4013936b8901ddb39c6801abc07c" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.076477 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n2mgs"] Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.085820 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n2mgs"] Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.167243 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hhvxg"] Feb 17 17:03:04 crc kubenswrapper[4694]: E0217 17:03:04.168072 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d06bc9-36b6-46b3-83d9-49335aa9c01e" containerName="keystone-bootstrap" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.168150 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d06bc9-36b6-46b3-83d9-49335aa9c01e" containerName="keystone-bootstrap" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.168478 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d06bc9-36b6-46b3-83d9-49335aa9c01e" containerName="keystone-bootstrap" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.169575 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.174372 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.174489 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.174968 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.175232 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z26nq" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.175416 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.183271 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hhvxg"] Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.279121 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-credential-keys\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.279574 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-combined-ca-bundle\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.279803 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-config-data\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.279968 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfnp8\" (UniqueName: \"kubernetes.io/projected/a56edac5-8790-4475-ac42-c958ef4e523a-kube-api-access-gfnp8\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.280090 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-fernet-keys\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.280273 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-scripts\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.382737 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-combined-ca-bundle\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.383079 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-config-data\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.383179 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfnp8\" (UniqueName: \"kubernetes.io/projected/a56edac5-8790-4475-ac42-c958ef4e523a-kube-api-access-gfnp8\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.383276 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-fernet-keys\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.383429 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-scripts\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.383568 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-credential-keys\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.388761 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-credential-keys\") pod \"keystone-bootstrap-hhvxg\" (UID: 
\"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.397320 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-scripts\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.397516 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-combined-ca-bundle\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.397699 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-config-data\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.398293 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-fernet-keys\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.405917 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfnp8\" (UniqueName: \"kubernetes.io/projected/a56edac5-8790-4475-ac42-c958ef4e523a-kube-api-access-gfnp8\") pod \"keystone-bootstrap-hhvxg\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.488317 
4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:04 crc kubenswrapper[4694]: I0217 17:03:04.905063 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d06bc9-36b6-46b3-83d9-49335aa9c01e" path="/var/lib/kubelet/pods/a0d06bc9-36b6-46b3-83d9-49335aa9c01e/volumes" Feb 17 17:03:05 crc kubenswrapper[4694]: I0217 17:03:05.567063 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:03:05 crc kubenswrapper[4694]: I0217 17:03:05.631211 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-fvgsp"] Feb 17 17:03:05 crc kubenswrapper[4694]: I0217 17:03:05.631429 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" containerID="cri-o://eacc3fdd7029d84917ca9abe1410eeb4a035073b8fb503a5e13127a7402d05fd" gracePeriod=10 Feb 17 17:03:06 crc kubenswrapper[4694]: I0217 17:03:06.664352 4694 generic.go:334] "Generic (PLEG): container finished" podID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerID="eacc3fdd7029d84917ca9abe1410eeb4a035073b8fb503a5e13127a7402d05fd" exitCode=0 Feb 17 17:03:06 crc kubenswrapper[4694]: I0217 17:03:06.664434 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" event={"ID":"6bc3eb44-a604-462b-8a1d-9bb52e39a18b","Type":"ContainerDied","Data":"eacc3fdd7029d84917ca9abe1410eeb4a035073b8fb503a5e13127a7402d05fd"} Feb 17 17:03:08 crc kubenswrapper[4694]: I0217 17:03:08.608138 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Feb 17 17:03:10 crc kubenswrapper[4694]: 
E0217 17:03:10.003245 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c3f4af_5b32_4e7f_a562_6fd529a1abaf.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.011473 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.012088 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n579h5f9h684h696h97h674h668hdbh5c6hdh5cfh9dh56dhd4h66ch5f7h85h56ch684hb9hd6h66hb9hd5h8ch589hbbh8dh595h5bdh85hfcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wjfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle
:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-dfc8b5bfc-j4v8j_openstack(8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.015504 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-dfc8b5bfc-j4v8j" podUID="8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.028338 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.028472 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dfh67fh65fh567h5fh594h78h68bh8dhc7h5bdh67h5d4h688h586h558h69hd4h88hch86h579h646h544h6ch74h94h97h5c9h5fbh65bh5ffq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-674q2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-894447f7f-w5grh_openstack(83185784-bd77-41d3-a0da-28fa4fabf086): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 
17:03:13.030080 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.030255 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n665h545h98h549hf4h84h5f4hcfh6dh5d6h55h67fhdbh574h657h597h5d5h5d6h5c8h664h5cbh659h6fh5fbh5c8h646hb4hb8h57h58h56fh5b4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qgp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,Secc
ompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-d99d85789-zsrm7_openstack(3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.030762 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-894447f7f-w5grh" podUID="83185784-bd77-41d3-a0da-28fa4fabf086" Feb 17 17:03:13 crc kubenswrapper[4694]: E0217 17:03:13.032799 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-d99d85789-zsrm7" podUID="3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" Feb 17 17:03:13 crc kubenswrapper[4694]: I0217 17:03:13.524517 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:03:13 crc kubenswrapper[4694]: I0217 17:03:13.607721 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Feb 17 17:03:16 crc 
kubenswrapper[4694]: I0217 17:03:16.742638 4694 generic.go:334] "Generic (PLEG): container finished" podID="15ec1579-807c-4af0-8332-9a52733beed0" containerID="39724c483ff231898372944df1f10912b1ad27b5bcae744ea3bdc174bada0df2" exitCode=0 Feb 17 17:03:16 crc kubenswrapper[4694]: I0217 17:03:16.742756 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dgmfl" event={"ID":"15ec1579-807c-4af0-8332-9a52733beed0","Type":"ContainerDied","Data":"39724c483ff231898372944df1f10912b1ad27b5bcae744ea3bdc174bada0df2"} Feb 17 17:03:18 crc kubenswrapper[4694]: I0217 17:03:18.606969 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Feb 17 17:03:18 crc kubenswrapper[4694]: I0217 17:03:18.607383 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:03:21 crc kubenswrapper[4694]: E0217 17:03:21.670034 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 17 17:03:21 crc kubenswrapper[4694]: E0217 17:03:21.670338 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbfh67dh577h556h9ch647h599h689h595h5ffh5cdhb4hcdh567h544h5b7h558h697h5bfh648h79h585h5bbh86h5ddh58dh658h5b7h5c5h54h554h59cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9z7tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e23514ea-6a1f-433d-ab93-663bd65629d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:03:22 crc kubenswrapper[4694]: E0217 17:03:22.133677 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 17 17:03:22 crc kubenswrapper[4694]: E0217 17:03:22.133827 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n2vwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-c88gm_openstack(aaac0bee-f5f9-49c0-b880-6c57d412972e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:03:22 crc kubenswrapper[4694]: E0217 17:03:22.135080 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-c88gm" 
podUID="aaac0bee-f5f9-49c0-b880-6c57d412972e" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.235825 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.250983 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.256927 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.261952 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328272 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qgp2\" (UniqueName: \"kubernetes.io/projected/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-kube-api-access-8qgp2\") pod \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328311 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83185784-bd77-41d3-a0da-28fa4fabf086-logs\") pod \"83185784-bd77-41d3-a0da-28fa4fabf086\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328336 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-combined-ca-bundle\") pod \"15ec1579-807c-4af0-8332-9a52733beed0\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328363 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-scripts\") pod \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328385 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-config-data\") pod \"83185784-bd77-41d3-a0da-28fa4fabf086\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328407 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-logs\") pod \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328428 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-config-data\") pod \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328453 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-horizon-secret-key\") pod \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\" (UID: \"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328521 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-logs\") pod \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328545 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-horizon-secret-key\") pod \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328568 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-scripts\") pod \"83185784-bd77-41d3-a0da-28fa4fabf086\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328588 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wjfv\" (UniqueName: \"kubernetes.io/projected/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-kube-api-access-6wjfv\") pod \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328636 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gsf\" (UniqueName: \"kubernetes.io/projected/15ec1579-807c-4af0-8332-9a52733beed0-kube-api-access-b2gsf\") pod \"15ec1579-807c-4af0-8332-9a52733beed0\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328654 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-scripts\") pod \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328688 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/83185784-bd77-41d3-a0da-28fa4fabf086-horizon-secret-key\") pod 
\"83185784-bd77-41d3-a0da-28fa4fabf086\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328723 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674q2\" (UniqueName: \"kubernetes.io/projected/83185784-bd77-41d3-a0da-28fa4fabf086-kube-api-access-674q2\") pod \"83185784-bd77-41d3-a0da-28fa4fabf086\" (UID: \"83185784-bd77-41d3-a0da-28fa4fabf086\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328739 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-config\") pod \"15ec1579-807c-4af0-8332-9a52733beed0\" (UID: \"15ec1579-807c-4af0-8332-9a52733beed0\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328766 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-config-data\") pod \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\" (UID: \"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a\") " Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.328907 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-logs" (OuterVolumeSpecName: "logs") pod "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" (UID: "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.329305 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.329697 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-scripts" (OuterVolumeSpecName: "scripts") pod "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" (UID: "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.330228 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-config-data" (OuterVolumeSpecName: "config-data") pod "83185784-bd77-41d3-a0da-28fa4fabf086" (UID: "83185784-bd77-41d3-a0da-28fa4fabf086"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.330724 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83185784-bd77-41d3-a0da-28fa4fabf086-logs" (OuterVolumeSpecName: "logs") pod "83185784-bd77-41d3-a0da-28fa4fabf086" (UID: "83185784-bd77-41d3-a0da-28fa4fabf086"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.330869 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-config-data" (OuterVolumeSpecName: "config-data") pod "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" (UID: "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.331778 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-logs" (OuterVolumeSpecName: "logs") pod "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" (UID: "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.332005 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-config-data" (OuterVolumeSpecName: "config-data") pod "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" (UID: "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.332952 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-scripts" (OuterVolumeSpecName: "scripts") pod "83185784-bd77-41d3-a0da-28fa4fabf086" (UID: "83185784-bd77-41d3-a0da-28fa4fabf086"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.333802 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-scripts" (OuterVolumeSpecName: "scripts") pod "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" (UID: "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.334878 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-kube-api-access-8qgp2" (OuterVolumeSpecName: "kube-api-access-8qgp2") pod "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" (UID: "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260"). InnerVolumeSpecName "kube-api-access-8qgp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.336080 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ec1579-807c-4af0-8332-9a52733beed0-kube-api-access-b2gsf" (OuterVolumeSpecName: "kube-api-access-b2gsf") pod "15ec1579-807c-4af0-8332-9a52733beed0" (UID: "15ec1579-807c-4af0-8332-9a52733beed0"). InnerVolumeSpecName "kube-api-access-b2gsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.338309 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83185784-bd77-41d3-a0da-28fa4fabf086-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "83185784-bd77-41d3-a0da-28fa4fabf086" (UID: "83185784-bd77-41d3-a0da-28fa4fabf086"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.338877 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-kube-api-access-6wjfv" (OuterVolumeSpecName: "kube-api-access-6wjfv") pod "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" (UID: "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a"). InnerVolumeSpecName "kube-api-access-6wjfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.339002 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83185784-bd77-41d3-a0da-28fa4fabf086-kube-api-access-674q2" (OuterVolumeSpecName: "kube-api-access-674q2") pod "83185784-bd77-41d3-a0da-28fa4fabf086" (UID: "83185784-bd77-41d3-a0da-28fa4fabf086"). InnerVolumeSpecName "kube-api-access-674q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.339139 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" (UID: "3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.343802 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" (UID: "8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.355876 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15ec1579-807c-4af0-8332-9a52733beed0" (UID: "15ec1579-807c-4af0-8332-9a52733beed0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.358053 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-config" (OuterVolumeSpecName: "config") pod "15ec1579-807c-4af0-8332-9a52733beed0" (UID: "15ec1579-807c-4af0-8332-9a52733beed0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430755 4694 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430792 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430803 4694 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430812 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430821 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wjfv\" (UniqueName: \"kubernetes.io/projected/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-kube-api-access-6wjfv\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430833 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gsf\" (UniqueName: 
\"kubernetes.io/projected/15ec1579-807c-4af0-8332-9a52733beed0-kube-api-access-b2gsf\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430842 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430852 4694 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/83185784-bd77-41d3-a0da-28fa4fabf086-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430860 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430868 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674q2\" (UniqueName: \"kubernetes.io/projected/83185784-bd77-41d3-a0da-28fa4fabf086-kube-api-access-674q2\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430876 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430885 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qgp2\" (UniqueName: \"kubernetes.io/projected/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-kube-api-access-8qgp2\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430893 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83185784-bd77-41d3-a0da-28fa4fabf086-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc 
kubenswrapper[4694]: I0217 17:03:22.430901 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ec1579-807c-4af0-8332-9a52733beed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430908 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430916 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83185784-bd77-41d3-a0da-28fa4fabf086-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.430924 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.793512 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-894447f7f-w5grh" event={"ID":"83185784-bd77-41d3-a0da-28fa4fabf086","Type":"ContainerDied","Data":"36ba35d4275d4f4d602a1872bd288f96cb024f5e55cbff69db00bf71b4a1f0eb"} Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.793602 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-894447f7f-w5grh" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.795554 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dfc8b5bfc-j4v8j" event={"ID":"8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a","Type":"ContainerDied","Data":"8027f8872a538a0b6bcdb9d1c09d19ae2061ffb0f19fa32e282c3a66b06c5845"} Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.795584 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dfc8b5bfc-j4v8j" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.817191 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dgmfl" event={"ID":"15ec1579-807c-4af0-8332-9a52733beed0","Type":"ContainerDied","Data":"7427e03f2d84fa3e66b045872908e7c6a5944e358029d5da83d818349c8cd312"} Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.817216 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dgmfl" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.817230 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7427e03f2d84fa3e66b045872908e7c6a5944e358029d5da83d818349c8cd312" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.819107 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5511f1e0-fc43-4f6f-81d4-8eb5655aea61","Type":"ContainerStarted","Data":"5d17b79f6f0ca0e20ff28597c2cb8e5c9c3fa37de23b92f7cc706f85b5d8f93c"} Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.821132 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d99d85789-zsrm7" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.821314 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d99d85789-zsrm7" event={"ID":"3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260","Type":"ContainerDied","Data":"a05c1a983adf9eb5dae115c0763be198847408565dcd18eebc362f90f3d4849f"} Feb 17 17:03:22 crc kubenswrapper[4694]: E0217 17:03:22.822195 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-c88gm" podUID="aaac0bee-f5f9-49c0-b880-6c57d412972e" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.856365 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-894447f7f-w5grh"] Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.865230 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-894447f7f-w5grh"] Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.920535 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83185784-bd77-41d3-a0da-28fa4fabf086" path="/var/lib/kubelet/pods/83185784-bd77-41d3-a0da-28fa4fabf086/volumes" Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.921019 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dfc8b5bfc-j4v8j"] Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.921947 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dfc8b5bfc-j4v8j"] Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.972235 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d99d85789-zsrm7"] Feb 17 17:03:22 crc kubenswrapper[4694]: I0217 17:03:22.985747 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d99d85789-zsrm7"] Feb 17 17:03:23 crc 
kubenswrapper[4694]: E0217 17:03:23.472433 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 17:03:23 crc kubenswrapper[4694]: E0217 17:03:23.473112 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:kube-api-access-z8vqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-x8w9s_openstack(a3ff074b-45bc-4b82-89cf-b42f4b5991e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:03:23 crc kubenswrapper[4694]: E0217 17:03:23.474761 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-x8w9s" podUID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.551496 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-s9jhl"] Feb 17 17:03:23 crc kubenswrapper[4694]: E0217 17:03:23.551986 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ec1579-807c-4af0-8332-9a52733beed0" containerName="neutron-db-sync" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.552010 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ec1579-807c-4af0-8332-9a52733beed0" containerName="neutron-db-sync" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.552245 4694 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="15ec1579-807c-4af0-8332-9a52733beed0" containerName="neutron-db-sync" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.553521 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.571242 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.595750 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-s9jhl"] Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657293 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7t2\" (UniqueName: \"kubernetes.io/projected/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-kube-api-access-kt7t2\") pod \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657415 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-dns-svc\") pod \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657434 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-sb\") pod \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657516 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-config\") pod \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\" (UID: 
\"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657569 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-nb\") pod \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\" (UID: \"6bc3eb44-a604-462b-8a1d-9bb52e39a18b\") " Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657842 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657872 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657900 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfq4n\" (UniqueName: \"kubernetes.io/projected/4962ca9a-0d86-4074-b50e-14ded17f8c4d-kube-api-access-kfq4n\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657933 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " 
pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657958 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-config\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.657989 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.678273 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-kube-api-access-kt7t2" (OuterVolumeSpecName: "kube-api-access-kt7t2") pod "6bc3eb44-a604-462b-8a1d-9bb52e39a18b" (UID: "6bc3eb44-a604-462b-8a1d-9bb52e39a18b"). InnerVolumeSpecName "kube-api-access-kt7t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.680316 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c4845f94d-rb96f"] Feb 17 17:03:23 crc kubenswrapper[4694]: E0217 17:03:23.680652 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="init" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.680664 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="init" Feb 17 17:03:23 crc kubenswrapper[4694]: E0217 17:03:23.680689 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.680696 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.680863 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" containerName="dnsmasq-dns" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.681681 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.688810 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c4845f94d-rb96f"] Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.696068 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.696299 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.696433 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ngrvm" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.696569 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.717331 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bc3eb44-a604-462b-8a1d-9bb52e39a18b" (UID: "6bc3eb44-a604-462b-8a1d-9bb52e39a18b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.740642 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6bc3eb44-a604-462b-8a1d-9bb52e39a18b" (UID: "6bc3eb44-a604-462b-8a1d-9bb52e39a18b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760538 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760585 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760680 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfq4n\" (UniqueName: \"kubernetes.io/projected/4962ca9a-0d86-4074-b50e-14ded17f8c4d-kube-api-access-kfq4n\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760711 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760732 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-config\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 
17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760762 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760845 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7t2\" (UniqueName: \"kubernetes.io/projected/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-kube-api-access-kt7t2\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760862 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.760873 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.761646 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.765184 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc 
kubenswrapper[4694]: I0217 17:03:23.765761 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-config\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.766060 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.771071 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.787500 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfq4n\" (UniqueName: \"kubernetes.io/projected/4962ca9a-0d86-4074-b50e-14ded17f8c4d-kube-api-access-kfq4n\") pod \"dnsmasq-dns-5ccc5c4795-s9jhl\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.799183 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6bc3eb44-a604-462b-8a1d-9bb52e39a18b" (UID: "6bc3eb44-a604-462b-8a1d-9bb52e39a18b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.825468 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-config" (OuterVolumeSpecName: "config") pod "6bc3eb44-a604-462b-8a1d-9bb52e39a18b" (UID: "6bc3eb44-a604-462b-8a1d-9bb52e39a18b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.833813 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.834255 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-fvgsp" event={"ID":"6bc3eb44-a604-462b-8a1d-9bb52e39a18b","Type":"ContainerDied","Data":"b003ecd5084315d3280d458884c84d8da0943133b450c3b76fc702fc35539aa5"} Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.834312 4694 scope.go:117] "RemoveContainer" containerID="eacc3fdd7029d84917ca9abe1410eeb4a035073b8fb503a5e13127a7402d05fd" Feb 17 17:03:23 crc kubenswrapper[4694]: E0217 17:03:23.860213 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-x8w9s" podUID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.862710 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh69k\" (UniqueName: \"kubernetes.io/projected/abc565ee-1969-40b4-874f-1b71f43a8972-kube-api-access-qh69k\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: 
I0217 17:03:23.862787 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-combined-ca-bundle\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.862901 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-httpd-config\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.862933 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-config\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.862976 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-ovndb-tls-certs\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.863078 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.863090 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6bc3eb44-a604-462b-8a1d-9bb52e39a18b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.903588 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-fvgsp"] Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.915525 4694 scope.go:117] "RemoveContainer" containerID="27c874d79f7fba7f03ba50525595042c3b87ec5d6b0d41f80e8d8082746a69fe" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.919931 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-fvgsp"] Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.958024 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.964460 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-httpd-config\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.964518 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-config\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.964598 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-ovndb-tls-certs\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.964687 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh69k\" (UniqueName: \"kubernetes.io/projected/abc565ee-1969-40b4-874f-1b71f43a8972-kube-api-access-qh69k\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.964728 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-combined-ca-bundle\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.970677 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-httpd-config\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.971987 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-combined-ca-bundle\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.975249 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-ovndb-tls-certs\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:23 crc kubenswrapper[4694]: I0217 17:03:23.975711 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-config\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.001876 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh69k\" (UniqueName: \"kubernetes.io/projected/abc565ee-1969-40b4-874f-1b71f43a8972-kube-api-access-qh69k\") pod \"neutron-7c4845f94d-rb96f\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.022785 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hhvxg"] Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.025292 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.103633 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.168495 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.640194 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-s9jhl"] Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.828100 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c4845f94d-rb96f"] Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.852059 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757dbcd46d-pw2kl" event={"ID":"5bc102be-9643-4310-900a-c6f6803a395a","Type":"ContainerStarted","Data":"3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.852101 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-757dbcd46d-pw2kl" event={"ID":"5bc102be-9643-4310-900a-c6f6803a395a","Type":"ContainerStarted","Data":"b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.854683 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-swggc" event={"ID":"8994bf6c-4617-4837-a8a2-4d399f187abb","Type":"ContainerStarted","Data":"40ee26cc8680cea303b2f2855f47dfe1afcf69bbc87b19279d146731379268e3"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.861258 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f4f9856-rcwl9" event={"ID":"17711b82-3f49-41da-b17d-785c70869492","Type":"ContainerStarted","Data":"4ac8ad0a384210ef1cb360ae01861b3913dbc6723a4e833f2e600bf7b5e8e7ed"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.861313 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8f4f9856-rcwl9" event={"ID":"17711b82-3f49-41da-b17d-785c70869492","Type":"ContainerStarted","Data":"8851384e676192463f02db515a5f913626942ace79b5457030b26d3dbb439e44"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.863671 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5511f1e0-fc43-4f6f-81d4-8eb5655aea61","Type":"ContainerStarted","Data":"2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.873487 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15056d4d-99d7-4c45-bd24-8141aeca9791","Type":"ContainerStarted","Data":"450afac0a7ba73c8ff738a817d80e92f425f23ba7d452583562ba78ddb84797f"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.881651 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hhvxg" 
event={"ID":"a56edac5-8790-4475-ac42-c958ef4e523a","Type":"ContainerStarted","Data":"efb4b8ceef872668e6b83e8412c1d3fd7f9b6b53acf6ff5521691ea2549711cf"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.881731 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hhvxg" event={"ID":"a56edac5-8790-4475-ac42-c958ef4e523a","Type":"ContainerStarted","Data":"c574a9c7e6fa3e39098ecbc2da2c70e5171b5c6ad9829d7fd519a623ca9edd97"} Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.884533 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-757dbcd46d-pw2kl" podStartSLOduration=6.044614614 podStartE2EDuration="26.884518409s" podCreationTimestamp="2026-02-17 17:02:58 +0000 UTC" firstStartedPulling="2026-02-17 17:03:02.861893741 +0000 UTC m=+1250.618969065" lastFinishedPulling="2026-02-17 17:03:23.701797536 +0000 UTC m=+1271.458872860" observedRunningTime="2026-02-17 17:03:24.87114405 +0000 UTC m=+1272.628219384" watchObservedRunningTime="2026-02-17 17:03:24.884518409 +0000 UTC m=+1272.641593743" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.894433 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b8f4f9856-rcwl9" podStartSLOduration=2.353536304 podStartE2EDuration="25.894412313s" podCreationTimestamp="2026-02-17 17:02:59 +0000 UTC" firstStartedPulling="2026-02-17 17:03:00.106505208 +0000 UTC m=+1247.863580532" lastFinishedPulling="2026-02-17 17:03:23.647381217 +0000 UTC m=+1271.404456541" observedRunningTime="2026-02-17 17:03:24.890121697 +0000 UTC m=+1272.647197031" watchObservedRunningTime="2026-02-17 17:03:24.894412313 +0000 UTC m=+1272.651487637" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.911680 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-swggc" podStartSLOduration=3.267884975 podStartE2EDuration="34.911660047s" podCreationTimestamp="2026-02-17 17:02:50 +0000 
UTC" firstStartedPulling="2026-02-17 17:02:51.738379458 +0000 UTC m=+1239.495454782" lastFinishedPulling="2026-02-17 17:03:23.38215453 +0000 UTC m=+1271.139229854" observedRunningTime="2026-02-17 17:03:24.906422728 +0000 UTC m=+1272.663498072" watchObservedRunningTime="2026-02-17 17:03:24.911660047 +0000 UTC m=+1272.668735371" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.913475 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260" path="/var/lib/kubelet/pods/3e4f3c9b-0a3f-47e5-80d0-ba2a452ff260/volumes" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.914009 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc3eb44-a604-462b-8a1d-9bb52e39a18b" path="/var/lib/kubelet/pods/6bc3eb44-a604-462b-8a1d-9bb52e39a18b/volumes" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.914923 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a" path="/var/lib/kubelet/pods/8b9e288a-536f-4a0b-8cd4-cbe2d6ce619a/volumes" Feb 17 17:03:24 crc kubenswrapper[4694]: I0217 17:03:24.926979 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hhvxg" podStartSLOduration=20.926959124 podStartE2EDuration="20.926959124s" podCreationTimestamp="2026-02-17 17:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:24.924462822 +0000 UTC m=+1272.681538136" watchObservedRunningTime="2026-02-17 17:03:24.926959124 +0000 UTC m=+1272.684034448" Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.909775 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerStarted","Data":"b6a670dde438e732d228b307c640eff615c4816e3202510a5f5890905440fa27"} Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 
17:03:25.920870 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c4845f94d-rb96f" event={"ID":"abc565ee-1969-40b4-874f-1b71f43a8972","Type":"ContainerStarted","Data":"e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331"} Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.920915 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c4845f94d-rb96f" event={"ID":"abc565ee-1969-40b4-874f-1b71f43a8972","Type":"ContainerStarted","Data":"8ae8d6fde4f8c3f9d17f4de8dca5efeaebceddca60f90136d69fd45ce11b4802"} Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.930506 4694 generic.go:334] "Generic (PLEG): container finished" podID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerID="f0092ae2c5acf0cff1d4b5ff7bb31cbc70fe22f43561100e07e72fe583075853" exitCode=0 Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.930569 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" event={"ID":"4962ca9a-0d86-4074-b50e-14ded17f8c4d","Type":"ContainerDied","Data":"f0092ae2c5acf0cff1d4b5ff7bb31cbc70fe22f43561100e07e72fe583075853"} Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.930595 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" event={"ID":"4962ca9a-0d86-4074-b50e-14ded17f8c4d","Type":"ContainerStarted","Data":"11ef69e645d971e34fce061c50c0a2321a1a56c642c2f0317129c05432af3d8e"} Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.936641 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15056d4d-99d7-4c45-bd24-8141aeca9791","Type":"ContainerStarted","Data":"7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842"} Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.966940 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6648f68957-f2dks"] Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.974126 4694 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.982086 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.982275 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 17:03:25 crc kubenswrapper[4694]: I0217 17:03:25.994839 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6648f68957-f2dks"] Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135303 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-config\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135665 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmmcb\" (UniqueName: \"kubernetes.io/projected/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-kube-api-access-lmmcb\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135711 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-httpd-config\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135744 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-ovndb-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135774 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-public-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135813 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-internal-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.135859 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-combined-ca-bundle\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237659 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-ovndb-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237710 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-public-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237746 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-internal-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237792 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-combined-ca-bundle\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237836 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-config\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237866 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmmcb\" (UniqueName: \"kubernetes.io/projected/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-kube-api-access-lmmcb\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.237903 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-httpd-config\") pod 
\"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.244269 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-httpd-config\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.245176 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-config\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.245731 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-combined-ca-bundle\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.245792 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-ovndb-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.246573 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-public-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc 
kubenswrapper[4694]: I0217 17:03:26.248442 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-internal-tls-certs\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.264024 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmmcb\" (UniqueName: \"kubernetes.io/projected/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-kube-api-access-lmmcb\") pod \"neutron-6648f68957-f2dks\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.325589 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.877963 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6648f68957-f2dks"] Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.971386 4694 generic.go:334] "Generic (PLEG): container finished" podID="8994bf6c-4617-4837-a8a2-4d399f187abb" containerID="40ee26cc8680cea303b2f2855f47dfe1afcf69bbc87b19279d146731379268e3" exitCode=0 Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.973134 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-swggc" event={"ID":"8994bf6c-4617-4837-a8a2-4d399f187abb","Type":"ContainerDied","Data":"40ee26cc8680cea303b2f2855f47dfe1afcf69bbc87b19279d146731379268e3"} Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.978970 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5511f1e0-fc43-4f6f-81d4-8eb5655aea61","Type":"ContainerStarted","Data":"b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1"} Feb 17 17:03:26 crc 
kubenswrapper[4694]: I0217 17:03:26.985017 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" event={"ID":"4962ca9a-0d86-4074-b50e-14ded17f8c4d","Type":"ContainerStarted","Data":"4756bc878e03dc78aa60383f13e47ebd8d18069642910440443aa32fb9c22a94"} Feb 17 17:03:26 crc kubenswrapper[4694]: I0217 17:03:26.985529 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.009000 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6648f68957-f2dks" event={"ID":"d7bb7a42-01ac-46d8-bb50-8765a4ffd817","Type":"ContainerStarted","Data":"d90be3809255cce81da354e9420e3941ebce242a7ce9c97daee795dcf4519d76"} Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.019731 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=27.019713242 podStartE2EDuration="27.019713242s" podCreationTimestamp="2026-02-17 17:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:27.014902474 +0000 UTC m=+1274.771977798" watchObservedRunningTime="2026-02-17 17:03:27.019713242 +0000 UTC m=+1274.776788566" Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.026898 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15056d4d-99d7-4c45-bd24-8141aeca9791","Type":"ContainerStarted","Data":"241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe"} Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.031492 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c4845f94d-rb96f" event={"ID":"abc565ee-1969-40b4-874f-1b71f43a8972","Type":"ContainerStarted","Data":"1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988"} Feb 17 17:03:27 
crc kubenswrapper[4694]: I0217 17:03:27.032159 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.068568 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=27.068552164 podStartE2EDuration="27.068552164s" podCreationTimestamp="2026-02-17 17:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:27.063411217 +0000 UTC m=+1274.820486541" watchObservedRunningTime="2026-02-17 17:03:27.068552164 +0000 UTC m=+1274.825627488" Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.070113 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" podStartSLOduration=4.070106392 podStartE2EDuration="4.070106392s" podCreationTimestamp="2026-02-17 17:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:27.044910752 +0000 UTC m=+1274.801986076" watchObservedRunningTime="2026-02-17 17:03:27.070106392 +0000 UTC m=+1274.827181716" Feb 17 17:03:27 crc kubenswrapper[4694]: I0217 17:03:27.091314 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c4845f94d-rb96f" podStartSLOduration=4.091295644 podStartE2EDuration="4.091295644s" podCreationTimestamp="2026-02-17 17:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:27.082450256 +0000 UTC m=+1274.839525580" watchObservedRunningTime="2026-02-17 17:03:27.091295644 +0000 UTC m=+1274.848370968" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.405410 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-swggc" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.487117 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-combined-ca-bundle\") pod \"8994bf6c-4617-4837-a8a2-4d399f187abb\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.487561 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8994bf6c-4617-4837-a8a2-4d399f187abb-logs\") pod \"8994bf6c-4617-4837-a8a2-4d399f187abb\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.487976 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-scripts\") pod \"8994bf6c-4617-4837-a8a2-4d399f187abb\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.488014 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ftg9\" (UniqueName: \"kubernetes.io/projected/8994bf6c-4617-4837-a8a2-4d399f187abb-kube-api-access-5ftg9\") pod \"8994bf6c-4617-4837-a8a2-4d399f187abb\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.488041 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-config-data\") pod \"8994bf6c-4617-4837-a8a2-4d399f187abb\" (UID: \"8994bf6c-4617-4837-a8a2-4d399f187abb\") " Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.488244 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8994bf6c-4617-4837-a8a2-4d399f187abb-logs" (OuterVolumeSpecName: "logs") pod "8994bf6c-4617-4837-a8a2-4d399f187abb" (UID: "8994bf6c-4617-4837-a8a2-4d399f187abb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.488803 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8994bf6c-4617-4837-a8a2-4d399f187abb-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.505570 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-scripts" (OuterVolumeSpecName: "scripts") pod "8994bf6c-4617-4837-a8a2-4d399f187abb" (UID: "8994bf6c-4617-4837-a8a2-4d399f187abb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.513347 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8994bf6c-4617-4837-a8a2-4d399f187abb-kube-api-access-5ftg9" (OuterVolumeSpecName: "kube-api-access-5ftg9") pod "8994bf6c-4617-4837-a8a2-4d399f187abb" (UID: "8994bf6c-4617-4837-a8a2-4d399f187abb"). InnerVolumeSpecName "kube-api-access-5ftg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.516255 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-config-data" (OuterVolumeSpecName: "config-data") pod "8994bf6c-4617-4837-a8a2-4d399f187abb" (UID: "8994bf6c-4617-4837-a8a2-4d399f187abb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.524575 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8994bf6c-4617-4837-a8a2-4d399f187abb" (UID: "8994bf6c-4617-4837-a8a2-4d399f187abb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.590376 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.590419 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ftg9\" (UniqueName: \"kubernetes.io/projected/8994bf6c-4617-4837-a8a2-4d399f187abb-kube-api-access-5ftg9\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.590433 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:28 crc kubenswrapper[4694]: I0217 17:03:28.590446 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8994bf6c-4617-4837-a8a2-4d399f187abb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.061291 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-swggc" event={"ID":"8994bf6c-4617-4837-a8a2-4d399f187abb","Type":"ContainerDied","Data":"b72887e6687200e38dcb9bbc1ac1c5b7b1ffbd584e0a4f3c904dcb4cd054f061"} Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.061362 4694 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="b72887e6687200e38dcb9bbc1ac1c5b7b1ffbd584e0a4f3c904dcb4cd054f061" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.061306 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-swggc" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.063328 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6648f68957-f2dks" event={"ID":"d7bb7a42-01ac-46d8-bb50-8765a4ffd817","Type":"ContainerStarted","Data":"3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf"} Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.111515 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59959cfcd4-nr2rp"] Feb 17 17:03:29 crc kubenswrapper[4694]: E0217 17:03:29.111914 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8994bf6c-4617-4837-a8a2-4d399f187abb" containerName="placement-db-sync" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.111930 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8994bf6c-4617-4837-a8a2-4d399f187abb" containerName="placement-db-sync" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.112131 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8994bf6c-4617-4837-a8a2-4d399f187abb" containerName="placement-db-sync" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.113029 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.116079 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.116213 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.116296 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wmtwl" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.116377 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.116456 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.123511 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59959cfcd4-nr2rp"] Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.203885 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-config-data\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.203941 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-internal-tls-certs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.203962 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c86e79-4506-4f20-83e3-1e7e85c07c80-logs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.203985 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-combined-ca-bundle\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.204124 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-public-tls-certs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.204179 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-scripts\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.204199 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdmdr\" (UniqueName: \"kubernetes.io/projected/58c86e79-4506-4f20-83e3-1e7e85c07c80-kube-api-access-bdmdr\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305703 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-public-tls-certs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305766 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-scripts\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305794 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdmdr\" (UniqueName: \"kubernetes.io/projected/58c86e79-4506-4f20-83e3-1e7e85c07c80-kube-api-access-bdmdr\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305862 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-config-data\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305884 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-internal-tls-certs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305904 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c86e79-4506-4f20-83e3-1e7e85c07c80-logs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.305973 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-combined-ca-bundle\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.306433 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c86e79-4506-4f20-83e3-1e7e85c07c80-logs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.310520 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-scripts\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.312172 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-public-tls-certs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.312381 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-combined-ca-bundle\") pod \"placement-59959cfcd4-nr2rp\" 
(UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.320626 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-internal-tls-certs\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.324663 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdmdr\" (UniqueName: \"kubernetes.io/projected/58c86e79-4506-4f20-83e3-1e7e85c07c80-kube-api-access-bdmdr\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.337473 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-config-data\") pod \"placement-59959cfcd4-nr2rp\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.421920 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.421983 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.427746 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.471047 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:03:29 crc kubenswrapper[4694]: I0217 17:03:29.471126 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:03:30 crc kubenswrapper[4694]: I0217 17:03:30.110516 4694 generic.go:334] "Generic (PLEG): container finished" podID="a56edac5-8790-4475-ac42-c958ef4e523a" containerID="efb4b8ceef872668e6b83e8412c1d3fd7f9b6b53acf6ff5521691ea2549711cf" exitCode=0 Feb 17 17:03:30 crc kubenswrapper[4694]: I0217 17:03:30.110561 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hhvxg" event={"ID":"a56edac5-8790-4475-ac42-c958ef4e523a","Type":"ContainerDied","Data":"efb4b8ceef872668e6b83e8412c1d3fd7f9b6b53acf6ff5521691ea2549711cf"} Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.023549 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.023917 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.023932 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.023942 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.063647 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.075105 4694 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.184189 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.185112 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.185134 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.185691 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.237192 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 17:03:31 crc kubenswrapper[4694]: I0217 17:03:31.264522 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.118250 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.119345 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.402113 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.590866 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-scripts\") pod \"a56edac5-8790-4475-ac42-c958ef4e523a\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.591352 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-fernet-keys\") pod \"a56edac5-8790-4475-ac42-c958ef4e523a\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.591447 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-config-data\") pod \"a56edac5-8790-4475-ac42-c958ef4e523a\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.591469 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfnp8\" (UniqueName: \"kubernetes.io/projected/a56edac5-8790-4475-ac42-c958ef4e523a-kube-api-access-gfnp8\") pod \"a56edac5-8790-4475-ac42-c958ef4e523a\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.591494 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-combined-ca-bundle\") pod \"a56edac5-8790-4475-ac42-c958ef4e523a\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.591566 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-credential-keys\") pod \"a56edac5-8790-4475-ac42-c958ef4e523a\" (UID: \"a56edac5-8790-4475-ac42-c958ef4e523a\") " Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.596697 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a56edac5-8790-4475-ac42-c958ef4e523a" (UID: "a56edac5-8790-4475-ac42-c958ef4e523a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.599749 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56edac5-8790-4475-ac42-c958ef4e523a-kube-api-access-gfnp8" (OuterVolumeSpecName: "kube-api-access-gfnp8") pod "a56edac5-8790-4475-ac42-c958ef4e523a" (UID: "a56edac5-8790-4475-ac42-c958ef4e523a"). InnerVolumeSpecName "kube-api-access-gfnp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.599772 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-scripts" (OuterVolumeSpecName: "scripts") pod "a56edac5-8790-4475-ac42-c958ef4e523a" (UID: "a56edac5-8790-4475-ac42-c958ef4e523a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.599843 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a56edac5-8790-4475-ac42-c958ef4e523a" (UID: "a56edac5-8790-4475-ac42-c958ef4e523a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.618329 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-config-data" (OuterVolumeSpecName: "config-data") pod "a56edac5-8790-4475-ac42-c958ef4e523a" (UID: "a56edac5-8790-4475-ac42-c958ef4e523a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.623479 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a56edac5-8790-4475-ac42-c958ef4e523a" (UID: "a56edac5-8790-4475-ac42-c958ef4e523a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.693304 4694 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.693338 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.693348 4694 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.693358 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:33 crc 
kubenswrapper[4694]: I0217 17:03:33.693366 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfnp8\" (UniqueName: \"kubernetes.io/projected/a56edac5-8790-4475-ac42-c958ef4e523a-kube-api-access-gfnp8\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.693378 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56edac5-8790-4475-ac42-c958ef4e523a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.741586 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59959cfcd4-nr2rp"] Feb 17 17:03:33 crc kubenswrapper[4694]: W0217 17:03:33.747276 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58c86e79_4506_4f20_83e3_1e7e85c07c80.slice/crio-c95e5374e43ec984f9bc4f93240403d36834e18ce5da4bdb616dd26eeeefab16 WatchSource:0}: Error finding container c95e5374e43ec984f9bc4f93240403d36834e18ce5da4bdb616dd26eeeefab16: Status 404 returned error can't find the container with id c95e5374e43ec984f9bc4f93240403d36834e18ce5da4bdb616dd26eeeefab16 Feb 17 17:03:33 crc kubenswrapper[4694]: I0217 17:03:33.959823 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.063740 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jlvk2"] Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.064296 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerName="dnsmasq-dns" containerID="cri-o://d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2" gracePeriod=10 Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 
17:03:34.140729 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59959cfcd4-nr2rp" event={"ID":"58c86e79-4506-4f20-83e3-1e7e85c07c80","Type":"ContainerStarted","Data":"d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8"} Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.140994 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59959cfcd4-nr2rp" event={"ID":"58c86e79-4506-4f20-83e3-1e7e85c07c80","Type":"ContainerStarted","Data":"4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f"} Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.141008 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59959cfcd4-nr2rp" event={"ID":"58c86e79-4506-4f20-83e3-1e7e85c07c80","Type":"ContainerStarted","Data":"c95e5374e43ec984f9bc4f93240403d36834e18ce5da4bdb616dd26eeeefab16"} Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.146449 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6648f68957-f2dks" event={"ID":"d7bb7a42-01ac-46d8-bb50-8765a4ffd817","Type":"ContainerStarted","Data":"970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04"} Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.146584 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.148453 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerStarted","Data":"73ac86b877959f435d4577b40edb10d3adca62fa8d949dc3bf0cfefb61dfd708"} Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.150850 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hhvxg" event={"ID":"a56edac5-8790-4475-ac42-c958ef4e523a","Type":"ContainerDied","Data":"c574a9c7e6fa3e39098ecbc2da2c70e5171b5c6ad9829d7fd519a623ca9edd97"} Feb 17 17:03:34 
crc kubenswrapper[4694]: I0217 17:03:34.150889 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c574a9c7e6fa3e39098ecbc2da2c70e5171b5c6ad9829d7fd519a623ca9edd97" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.151001 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hhvxg" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.186217 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6648f68957-f2dks" podStartSLOduration=9.186196093 podStartE2EDuration="9.186196093s" podCreationTimestamp="2026-02-17 17:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:34.171242515 +0000 UTC m=+1281.928317859" watchObservedRunningTime="2026-02-17 17:03:34.186196093 +0000 UTC m=+1281.943271417" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.548942 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.549051 4694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.563354 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8667649c99-28rzh"] Feb 17 17:03:34 crc kubenswrapper[4694]: E0217 17:03:34.563773 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56edac5-8790-4475-ac42-c958ef4e523a" containerName="keystone-bootstrap" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.563795 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56edac5-8790-4475-ac42-c958ef4e523a" containerName="keystone-bootstrap" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.563994 4694 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a56edac5-8790-4475-ac42-c958ef4e523a" containerName="keystone-bootstrap" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.564574 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.567668 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.568000 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.568184 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-z26nq" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.568904 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.572485 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.572669 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.581768 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8667649c99-28rzh"] Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636200 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-credential-keys\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636267 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ffjnt\" (UniqueName: \"kubernetes.io/projected/b0aae110-6e5c-4f32-95d9-b4b3429ca622-kube-api-access-ffjnt\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636319 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-internal-tls-certs\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636383 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-fernet-keys\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636417 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-config-data\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636463 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-scripts\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636506 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-public-tls-certs\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.636554 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-combined-ca-bundle\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.634672 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.744043 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-fernet-keys\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.744114 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-config-data\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.744148 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-scripts\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 
17:03:34.744199 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-public-tls-certs\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.744245 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-combined-ca-bundle\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.745136 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-credential-keys\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.745210 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjnt\" (UniqueName: \"kubernetes.io/projected/b0aae110-6e5c-4f32-95d9-b4b3429ca622-kube-api-access-ffjnt\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.745295 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-internal-tls-certs\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.749819 4694 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.752329 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-combined-ca-bundle\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.752961 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-fernet-keys\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.753519 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-scripts\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.754384 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-public-tls-certs\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.754592 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-credential-keys\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh" Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 
17:03:34.757193 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-internal-tls-certs\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh"
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.766585 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0aae110-6e5c-4f32-95d9-b4b3429ca622-config-data\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh"
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.797060 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjnt\" (UniqueName: \"kubernetes.io/projected/b0aae110-6e5c-4f32-95d9-b4b3429ca622-kube-api-access-ffjnt\") pod \"keystone-8667649c99-28rzh\" (UID: \"b0aae110-6e5c-4f32-95d9-b4b3429ca622\") " pod="openstack/keystone-8667649c99-28rzh"
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.847476 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-config\") pod \"06daa413-9d80-4e00-a276-d84c2e15a56f\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") "
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.847554 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-swift-storage-0\") pod \"06daa413-9d80-4e00-a276-d84c2e15a56f\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") "
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.847620 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-sb\") pod \"06daa413-9d80-4e00-a276-d84c2e15a56f\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") "
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.847664 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rzpn\" (UniqueName: \"kubernetes.io/projected/06daa413-9d80-4e00-a276-d84c2e15a56f-kube-api-access-6rzpn\") pod \"06daa413-9d80-4e00-a276-d84c2e15a56f\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") "
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.847690 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-nb\") pod \"06daa413-9d80-4e00-a276-d84c2e15a56f\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") "
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.847803 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-svc\") pod \"06daa413-9d80-4e00-a276-d84c2e15a56f\" (UID: \"06daa413-9d80-4e00-a276-d84c2e15a56f\") "
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.856214 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06daa413-9d80-4e00-a276-d84c2e15a56f-kube-api-access-6rzpn" (OuterVolumeSpecName: "kube-api-access-6rzpn") pod "06daa413-9d80-4e00-a276-d84c2e15a56f" (UID: "06daa413-9d80-4e00-a276-d84c2e15a56f"). InnerVolumeSpecName "kube-api-access-6rzpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.916417 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06daa413-9d80-4e00-a276-d84c2e15a56f" (UID: "06daa413-9d80-4e00-a276-d84c2e15a56f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.916830 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "06daa413-9d80-4e00-a276-d84c2e15a56f" (UID: "06daa413-9d80-4e00-a276-d84c2e15a56f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.930817 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06daa413-9d80-4e00-a276-d84c2e15a56f" (UID: "06daa413-9d80-4e00-a276-d84c2e15a56f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.945270 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-config" (OuterVolumeSpecName: "config") pod "06daa413-9d80-4e00-a276-d84c2e15a56f" (UID: "06daa413-9d80-4e00-a276-d84c2e15a56f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.953195 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.953244 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.953307 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.953324 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rzpn\" (UniqueName: \"kubernetes.io/projected/06daa413-9d80-4e00-a276-d84c2e15a56f-kube-api-access-6rzpn\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.953336 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:34 crc kubenswrapper[4694]: I0217 17:03:34.953990 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06daa413-9d80-4e00-a276-d84c2e15a56f" (UID: "06daa413-9d80-4e00-a276-d84c2e15a56f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.013063 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8667649c99-28rzh"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.057123 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06daa413-9d80-4e00-a276-d84c2e15a56f-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.251981 4694 generic.go:334] "Generic (PLEG): container finished" podID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerID="d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2" exitCode=0
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.253939 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.257504 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" event={"ID":"06daa413-9d80-4e00-a276-d84c2e15a56f","Type":"ContainerDied","Data":"d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2"}
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.257544 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-jlvk2" event={"ID":"06daa413-9d80-4e00-a276-d84c2e15a56f","Type":"ContainerDied","Data":"f9b92ee1b351ec300e8a7962102dda9f1b78f18689a677253479011157123725"}
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.257560 4694 scope.go:117] "RemoveContainer" containerID="d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.257849 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59959cfcd4-nr2rp"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.257876 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59959cfcd4-nr2rp"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.332398 4694 scope.go:117] "RemoveContainer" containerID="1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.337809 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59959cfcd4-nr2rp" podStartSLOduration=6.33778377 podStartE2EDuration="6.33778377s" podCreationTimestamp="2026-02-17 17:03:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:35.291249915 +0000 UTC m=+1283.048325239" watchObservedRunningTime="2026-02-17 17:03:35.33778377 +0000 UTC m=+1283.094859094"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.348664 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jlvk2"]
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.355652 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-jlvk2"]
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.380123 4694 scope.go:117] "RemoveContainer" containerID="d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2"
Feb 17 17:03:35 crc kubenswrapper[4694]: E0217 17:03:35.383704 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2\": container with ID starting with d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2 not found: ID does not exist" containerID="d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.383746 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2"} err="failed to get container status \"d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2\": rpc error: code = NotFound desc = could not find container \"d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2\": container with ID starting with d3fe8b3bf3244217594ebb5cb4046fa8a04bbdaac250e73ff81a6b0c0670d6d2 not found: ID does not exist"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.383769 4694 scope.go:117] "RemoveContainer" containerID="1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd"
Feb 17 17:03:35 crc kubenswrapper[4694]: E0217 17:03:35.384273 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd\": container with ID starting with 1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd not found: ID does not exist" containerID="1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.384313 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd"} err="failed to get container status \"1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd\": rpc error: code = NotFound desc = could not find container \"1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd\": container with ID starting with 1b24a72461ae80e625eba578863f85ea3254ad0d3687d87ab6279e802a12c4fd not found: ID does not exist"
Feb 17 17:03:35 crc kubenswrapper[4694]: I0217 17:03:35.666023 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8667649c99-28rzh"]
Feb 17 17:03:35 crc kubenswrapper[4694]: W0217 17:03:35.681326 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0aae110_6e5c_4f32_95d9_b4b3429ca622.slice/crio-c5e674ef223bc977b50ad54a4a991ed2100558ad0ccb55d91dc90cf264cf2277 WatchSource:0}: Error finding container c5e674ef223bc977b50ad54a4a991ed2100558ad0ccb55d91dc90cf264cf2277: Status 404 returned error can't find the container with id c5e674ef223bc977b50ad54a4a991ed2100558ad0ccb55d91dc90cf264cf2277
Feb 17 17:03:36 crc kubenswrapper[4694]: I0217 17:03:36.270150 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8667649c99-28rzh" event={"ID":"b0aae110-6e5c-4f32-95d9-b4b3429ca622","Type":"ContainerStarted","Data":"d8785a7a0766f7fcfea7a7dc903bdb752a1d7bae98f234e873db37223c348979"}
Feb 17 17:03:36 crc kubenswrapper[4694]: I0217 17:03:36.270551 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8667649c99-28rzh" event={"ID":"b0aae110-6e5c-4f32-95d9-b4b3429ca622","Type":"ContainerStarted","Data":"c5e674ef223bc977b50ad54a4a991ed2100558ad0ccb55d91dc90cf264cf2277"}
Feb 17 17:03:36 crc kubenswrapper[4694]: I0217 17:03:36.292407 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8667649c99-28rzh" podStartSLOduration=2.292390701 podStartE2EDuration="2.292390701s" podCreationTimestamp="2026-02-17 17:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:36.28747837 +0000 UTC m=+1284.044553694" watchObservedRunningTime="2026-02-17 17:03:36.292390701 +0000 UTC m=+1284.049466025"
Feb 17 17:03:36 crc kubenswrapper[4694]: I0217 17:03:36.906631 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" path="/var/lib/kubelet/pods/06daa413-9d80-4e00-a276-d84c2e15a56f/volumes"
Feb 17 17:03:37 crc kubenswrapper[4694]: I0217 17:03:37.279499 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c88gm" event={"ID":"aaac0bee-f5f9-49c0-b880-6c57d412972e","Type":"ContainerStarted","Data":"c90a18508957bd152efd082f8496d131de7ab9c842d5dc91741236c1efc161a4"}
Feb 17 17:03:37 crc kubenswrapper[4694]: I0217 17:03:37.279709 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8667649c99-28rzh"
Feb 17 17:03:37 crc kubenswrapper[4694]: I0217 17:03:37.303710 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-c88gm" podStartSLOduration=5.710304748 podStartE2EDuration="47.303664587s" podCreationTimestamp="2026-02-17 17:02:50 +0000 UTC" firstStartedPulling="2026-02-17 17:02:54.724690994 +0000 UTC m=+1242.481766318" lastFinishedPulling="2026-02-17 17:03:36.318050833 +0000 UTC m=+1284.075126157" observedRunningTime="2026-02-17 17:03:37.292297877 +0000 UTC m=+1285.049373191" watchObservedRunningTime="2026-02-17 17:03:37.303664587 +0000 UTC m=+1285.060739911"
Feb 17 17:03:39 crc kubenswrapper[4694]: I0217 17:03:39.298003 4694 generic.go:334] "Generic (PLEG): container finished" podID="aaac0bee-f5f9-49c0-b880-6c57d412972e" containerID="c90a18508957bd152efd082f8496d131de7ab9c842d5dc91741236c1efc161a4" exitCode=0
Feb 17 17:03:39 crc kubenswrapper[4694]: I0217 17:03:39.298385 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c88gm" event={"ID":"aaac0bee-f5f9-49c0-b880-6c57d412972e","Type":"ContainerDied","Data":"c90a18508957bd152efd082f8496d131de7ab9c842d5dc91741236c1efc161a4"}
Feb 17 17:03:39 crc kubenswrapper[4694]: I0217 17:03:39.425227 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Feb 17 17:03:39 crc kubenswrapper[4694]: I0217 17:03:39.473290 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b8f4f9856-rcwl9" podUID="17711b82-3f49-41da-b17d-785c70869492" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.167268 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c88gm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.271868 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-combined-ca-bundle\") pod \"aaac0bee-f5f9-49c0-b880-6c57d412972e\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") "
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.272011 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-db-sync-config-data\") pod \"aaac0bee-f5f9-49c0-b880-6c57d412972e\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") "
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.272186 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2vwg\" (UniqueName: \"kubernetes.io/projected/aaac0bee-f5f9-49c0-b880-6c57d412972e-kube-api-access-n2vwg\") pod \"aaac0bee-f5f9-49c0-b880-6c57d412972e\" (UID: \"aaac0bee-f5f9-49c0-b880-6c57d412972e\") "
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.280433 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aaac0bee-f5f9-49c0-b880-6c57d412972e" (UID: "aaac0bee-f5f9-49c0-b880-6c57d412972e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.281943 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaac0bee-f5f9-49c0-b880-6c57d412972e-kube-api-access-n2vwg" (OuterVolumeSpecName: "kube-api-access-n2vwg") pod "aaac0bee-f5f9-49c0-b880-6c57d412972e" (UID: "aaac0bee-f5f9-49c0-b880-6c57d412972e"). InnerVolumeSpecName "kube-api-access-n2vwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.299466 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaac0bee-f5f9-49c0-b880-6c57d412972e" (UID: "aaac0bee-f5f9-49c0-b880-6c57d412972e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.317802 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c88gm" event={"ID":"aaac0bee-f5f9-49c0-b880-6c57d412972e","Type":"ContainerDied","Data":"13ba9b3f5b6c455e80d496ee3349257da5ffc245fcb98723693507d32a4c064e"}
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.317862 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c88gm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.317888 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13ba9b3f5b6c455e80d496ee3349257da5ffc245fcb98723693507d32a4c064e"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.373702 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2vwg\" (UniqueName: \"kubernetes.io/projected/aaac0bee-f5f9-49c0-b880-6c57d412972e-kube-api-access-n2vwg\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.373733 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.373742 4694 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aaac0bee-f5f9-49c0-b880-6c57d412972e-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.595966 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b65b64c9-dlmdm"]
Feb 17 17:03:41 crc kubenswrapper[4694]: E0217 17:03:41.597041 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerName="init"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.597068 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerName="init"
Feb 17 17:03:41 crc kubenswrapper[4694]: E0217 17:03:41.597090 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaac0bee-f5f9-49c0-b880-6c57d412972e" containerName="barbican-db-sync"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.597100 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaac0bee-f5f9-49c0-b880-6c57d412972e" containerName="barbican-db-sync"
Feb 17 17:03:41 crc kubenswrapper[4694]: E0217 17:03:41.597151 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerName="dnsmasq-dns"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.597160 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerName="dnsmasq-dns"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.597365 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaac0bee-f5f9-49c0-b880-6c57d412972e" containerName="barbican-db-sync"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.597408 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="06daa413-9d80-4e00-a276-d84c2e15a56f" containerName="dnsmasq-dns"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.598520 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.602966 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.603042 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k6rt9"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.603214 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.615794 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b65b64c9-dlmdm"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.663455 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2f59b"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.664998 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.685949 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-694d8d5c8-nq2bp"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.688711 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.693931 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.694683 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-694d8d5c8-nq2bp"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.701323 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-config-data-custom\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.701394 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0470d53c-a76c-4cf3-8f95-1ae293182645-logs\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.701420 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khhdt\" (UniqueName: \"kubernetes.io/projected/0470d53c-a76c-4cf3-8f95-1ae293182645-kube-api-access-khhdt\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.701467 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-combined-ca-bundle\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.701487 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-config-data\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.704767 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2f59b"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.769060 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b7f697b84-ththl"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.770431 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b7f697b84-ththl"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.775112 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.791534 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b7f697b84-ththl"]
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.804847 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-config-data\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.804910 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-config-data-custom\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.804939 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s7st\" (UniqueName: \"kubernetes.io/projected/d79bfe1a-e161-41b0-8eed-0f1879b1f990-kube-api-access-7s7st\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.804958 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-config-data-custom\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.804990 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0470d53c-a76c-4cf3-8f95-1ae293182645-logs\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805008 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khhdt\" (UniqueName: \"kubernetes.io/projected/0470d53c-a76c-4cf3-8f95-1ae293182645-kube-api-access-khhdt\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805027 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805047 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805080 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805099 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-combined-ca-bundle\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805118 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-config-data\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805152 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-config\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805178 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805201 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-combined-ca-bundle\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805217 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d79bfe1a-e161-41b0-8eed-0f1879b1f990-logs\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.805248 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngszq\" (UniqueName: \"kubernetes.io/projected/1313c9d0-5542-4147-a6ef-f66b09b571b4-kube-api-access-ngszq\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.806896 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0470d53c-a76c-4cf3-8f95-1ae293182645-logs\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.811217 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-combined-ca-bundle\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.811625 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-config-data\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.814098 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0470d53c-a76c-4cf3-8f95-1ae293182645-config-data-custom\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.822023 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khhdt\" (UniqueName: \"kubernetes.io/projected/0470d53c-a76c-4cf3-8f95-1ae293182645-kube-api-access-khhdt\") pod \"barbican-worker-6b65b64c9-dlmdm\" (UID: \"0470d53c-a76c-4cf3-8f95-1ae293182645\") " pod="openstack/barbican-worker-6b65b64c9-dlmdm"
Feb 17 17:03:41 crc kubenswrapper[4694]: E0217 17:03:41.859254 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.907241 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.907278 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.907310 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.907329 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data-custom\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910555 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-config\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910621 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b"
Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910690 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-combined-ca-bundle\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910709 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d79bfe1a-e161-41b0-8eed-0f1879b1f990-logs\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910774 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngszq\" (UniqueName: \"kubernetes.io/projected/1313c9d0-5542-4147-a6ef-f66b09b571b4-kube-api-access-ngszq\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910794 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910809 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szkz\" (UniqueName: \"kubernetes.io/projected/a450a10f-4986-4803-ac6c-0507e25ada5a-kube-api-access-8szkz\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910854 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-combined-ca-bundle\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910885 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a450a10f-4986-4803-ac6c-0507e25ada5a-logs\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.910930 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-config-data\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.911013 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s7st\" (UniqueName: \"kubernetes.io/projected/d79bfe1a-e161-41b0-8eed-0f1879b1f990-kube-api-access-7s7st\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.911034 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-config-data-custom\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc 
kubenswrapper[4694]: I0217 17:03:41.911851 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.912408 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.913056 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.914424 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-config-data-custom\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.914437 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-config\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.915008 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.915302 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d79bfe1a-e161-41b0-8eed-0f1879b1f990-logs\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.918838 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-combined-ca-bundle\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.927754 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bfe1a-e161-41b0-8eed-0f1879b1f990-config-data\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.932418 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6b65b64c9-dlmdm" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.934970 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngszq\" (UniqueName: \"kubernetes.io/projected/1313c9d0-5542-4147-a6ef-f66b09b571b4-kube-api-access-ngszq\") pod \"dnsmasq-dns-688c87cc99-2f59b\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:41 crc kubenswrapper[4694]: I0217 17:03:41.937663 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s7st\" (UniqueName: \"kubernetes.io/projected/d79bfe1a-e161-41b0-8eed-0f1879b1f990-kube-api-access-7s7st\") pod \"barbican-keystone-listener-694d8d5c8-nq2bp\" (UID: \"d79bfe1a-e161-41b0-8eed-0f1879b1f990\") " pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.005574 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.011674 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a450a10f-4986-4803-ac6c-0507e25ada5a-logs\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.011802 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data-custom\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.011874 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.011891 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szkz\" (UniqueName: \"kubernetes.io/projected/a450a10f-4986-4803-ac6c-0507e25ada5a-kube-api-access-8szkz\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.011911 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-combined-ca-bundle\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " 
pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.014798 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a450a10f-4986-4803-ac6c-0507e25ada5a-logs\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.027896 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data-custom\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.027989 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-combined-ca-bundle\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.028284 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.038978 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szkz\" (UniqueName: \"kubernetes.io/projected/a450a10f-4986-4803-ac6c-0507e25ada5a-kube-api-access-8szkz\") pod \"barbican-api-b7f697b84-ththl\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 
17:03:42.063802 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.099189 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.343140 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerStarted","Data":"73aea5c7d9bbb6b4eadd09cce29c9df796c542bdb2c21f07a89e1057f0fd9b2b"} Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.343517 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="ceilometer-notification-agent" containerID="cri-o://b6a670dde438e732d228b307c640eff615c4816e3202510a5f5890905440fa27" gracePeriod=30 Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.343760 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.343788 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="proxy-httpd" containerID="cri-o://73aea5c7d9bbb6b4eadd09cce29c9df796c542bdb2c21f07a89e1057f0fd9b2b" gracePeriod=30 Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.343905 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="sg-core" containerID="cri-o://73ac86b877959f435d4577b40edb10d3adca62fa8d949dc3bf0cfefb61dfd708" gracePeriod=30 Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.435499 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-6b65b64c9-dlmdm"] Feb 17 17:03:42 crc kubenswrapper[4694]: W0217 17:03:42.443987 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0470d53c_a76c_4cf3_8f95_1ae293182645.slice/crio-4caf8235dc7346ba2ea1ad4d0dc452d73a8b478dad9ca3ace7afe47e3257842b WatchSource:0}: Error finding container 4caf8235dc7346ba2ea1ad4d0dc452d73a8b478dad9ca3ace7afe47e3257842b: Status 404 returned error can't find the container with id 4caf8235dc7346ba2ea1ad4d0dc452d73a8b478dad9ca3ace7afe47e3257842b Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.652497 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-694d8d5c8-nq2bp"] Feb 17 17:03:42 crc kubenswrapper[4694]: W0217 17:03:42.781997 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda450a10f_4986_4803_ac6c_0507e25ada5a.slice/crio-71089725c118b5da327ff0c1983521ba9db14af5156939c1c384483a3eb79b6b WatchSource:0}: Error finding container 71089725c118b5da327ff0c1983521ba9db14af5156939c1c384483a3eb79b6b: Status 404 returned error can't find the container with id 71089725c118b5da327ff0c1983521ba9db14af5156939c1c384483a3eb79b6b Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.783236 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b7f697b84-ththl"] Feb 17 17:03:42 crc kubenswrapper[4694]: I0217 17:03:42.862146 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2f59b"] Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.352530 4694 generic.go:334] "Generic (PLEG): container finished" podID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerID="73aea5c7d9bbb6b4eadd09cce29c9df796c542bdb2c21f07a89e1057f0fd9b2b" exitCode=0 Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.352574 4694 generic.go:334] "Generic (PLEG): 
container finished" podID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerID="73ac86b877959f435d4577b40edb10d3adca62fa8d949dc3bf0cfefb61dfd708" exitCode=2 Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.352643 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerDied","Data":"73aea5c7d9bbb6b4eadd09cce29c9df796c542bdb2c21f07a89e1057f0fd9b2b"} Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.352687 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerDied","Data":"73ac86b877959f435d4577b40edb10d3adca62fa8d949dc3bf0cfefb61dfd708"} Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.354045 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b65b64c9-dlmdm" event={"ID":"0470d53c-a76c-4cf3-8f95-1ae293182645","Type":"ContainerStarted","Data":"4caf8235dc7346ba2ea1ad4d0dc452d73a8b478dad9ca3ace7afe47e3257842b"} Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.355033 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" event={"ID":"d79bfe1a-e161-41b0-8eed-0f1879b1f990","Type":"ContainerStarted","Data":"141ac2dcbbae87363bf8677eef077d59dd19cffac52a1e2311a71e456f363dbf"} Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.356240 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x8w9s" event={"ID":"a3ff074b-45bc-4b82-89cf-b42f4b5991e1","Type":"ContainerStarted","Data":"17852fe2d3511ee9cf884df541308d754b56783c29e45e3395cd35609c84b6b3"} Feb 17 17:03:43 crc kubenswrapper[4694]: I0217 17:03:43.357187 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b7f697b84-ththl" 
event={"ID":"a450a10f-4986-4803-ac6c-0507e25ada5a","Type":"ContainerStarted","Data":"71089725c118b5da327ff0c1983521ba9db14af5156939c1c384483a3eb79b6b"} Feb 17 17:03:43 crc kubenswrapper[4694]: W0217 17:03:43.635061 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1313c9d0_5542_4147_a6ef_f66b09b571b4.slice/crio-1d9d2ba691c9eb4b5f432bef99a01d03b99fc48d70279d229c1d057c3073d2c7 WatchSource:0}: Error finding container 1d9d2ba691c9eb4b5f432bef99a01d03b99fc48d70279d229c1d057c3073d2c7: Status 404 returned error can't find the container with id 1d9d2ba691c9eb4b5f432bef99a01d03b99fc48d70279d229c1d057c3073d2c7 Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.355656 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cff554946-ddg9w"] Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.389017 4694 generic.go:334] "Generic (PLEG): container finished" podID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerID="0d7173364ec4fcc3cc7b3d23ad8763e5ecce8d80f47f57a89717bc9f4c6ec3d2" exitCode=0 Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402099 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402152 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" event={"ID":"1313c9d0-5542-4147-a6ef-f66b09b571b4","Type":"ContainerDied","Data":"0d7173364ec4fcc3cc7b3d23ad8763e5ecce8d80f47f57a89717bc9f4c6ec3d2"} Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402179 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" event={"ID":"1313c9d0-5542-4147-a6ef-f66b09b571b4","Type":"ContainerStarted","Data":"1d9d2ba691c9eb4b5f432bef99a01d03b99fc48d70279d229c1d057c3073d2c7"} Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402195 4694 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cff554946-ddg9w"] Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402209 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b7f697b84-ththl" event={"ID":"a450a10f-4986-4803-ac6c-0507e25ada5a","Type":"ContainerStarted","Data":"88dc13f979a48a93cf1803ec295d65fc40765248cb2d55852292f1a08bf10ffe"} Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402239 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b7f697b84-ththl" event={"ID":"a450a10f-4986-4803-ac6c-0507e25ada5a","Type":"ContainerStarted","Data":"4936027ef37a6fcfbe3af74cc10dc10e7582fdbe64eee03eacd1d8fc28f9d67d"} Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402254 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.402369 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.407146 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.407390 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463117 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4hf\" (UniqueName: \"kubernetes.io/projected/1843aa1a-460a-42ec-adb2-b20b48c71a90-kube-api-access-2p4hf\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463367 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-combined-ca-bundle\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463395 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1843aa1a-460a-42ec-adb2-b20b48c71a90-logs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463449 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-config-data\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463474 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-config-data-custom\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463495 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-public-tls-certs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.463561 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-internal-tls-certs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.495055 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b7f697b84-ththl" podStartSLOduration=3.4950349 podStartE2EDuration="3.4950349s" podCreationTimestamp="2026-02-17 17:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:44.477371715 +0000 UTC m=+1292.234447049" watchObservedRunningTime="2026-02-17 17:03:44.4950349 +0000 UTC m=+1292.252110224" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.519100 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-x8w9s" podStartSLOduration=5.716855592 podStartE2EDuration="55.519084092s" podCreationTimestamp="2026-02-17 17:02:49 +0000 UTC" firstStartedPulling="2026-02-17 17:02:51.650922916 +0000 UTC m=+1239.407998240" lastFinishedPulling="2026-02-17 17:03:41.453151416 +0000 UTC m=+1289.210226740" observedRunningTime="2026-02-17 17:03:44.512186412 +0000 UTC m=+1292.269261736" watchObservedRunningTime="2026-02-17 17:03:44.519084092 +0000 UTC m=+1292.276159416" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565271 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-combined-ca-bundle\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565340 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1843aa1a-460a-42ec-adb2-b20b48c71a90-logs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565375 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-config-data\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565397 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-config-data-custom\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565420 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-public-tls-certs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565467 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-internal-tls-certs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.565520 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2p4hf\" (UniqueName: \"kubernetes.io/projected/1843aa1a-460a-42ec-adb2-b20b48c71a90-kube-api-access-2p4hf\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.568351 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1843aa1a-460a-42ec-adb2-b20b48c71a90-logs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.582342 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-internal-tls-certs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.582886 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-public-tls-certs\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.583036 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-combined-ca-bundle\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.584078 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-config-data-custom\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.587053 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1843aa1a-460a-42ec-adb2-b20b48c71a90-config-data\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.590593 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p4hf\" (UniqueName: \"kubernetes.io/projected/1843aa1a-460a-42ec-adb2-b20b48c71a90-kube-api-access-2p4hf\") pod \"barbican-api-7cff554946-ddg9w\" (UID: \"1843aa1a-460a-42ec-adb2-b20b48c71a90\") " pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:44 crc kubenswrapper[4694]: I0217 17:03:44.733748 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.416436 4694 generic.go:334] "Generic (PLEG): container finished" podID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerID="b6a670dde438e732d228b307c640eff615c4816e3202510a5f5890905440fa27" exitCode=0 Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.417704 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerDied","Data":"b6a670dde438e732d228b307c640eff615c4816e3202510a5f5890905440fa27"} Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.629344 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786174 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z7tc\" (UniqueName: \"kubernetes.io/projected/e23514ea-6a1f-433d-ab93-663bd65629d2-kube-api-access-9z7tc\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786220 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-config-data\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786259 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-log-httpd\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786328 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-run-httpd\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786350 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-scripts\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786369 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-combined-ca-bundle\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.786423 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-sg-core-conf-yaml\") pod \"e23514ea-6a1f-433d-ab93-663bd65629d2\" (UID: \"e23514ea-6a1f-433d-ab93-663bd65629d2\") " Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.792016 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.792173 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.797240 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23514ea-6a1f-433d-ab93-663bd65629d2-kube-api-access-9z7tc" (OuterVolumeSpecName: "kube-api-access-9z7tc") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "kube-api-access-9z7tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.799541 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-scripts" (OuterVolumeSpecName: "scripts") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.834801 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.890325 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.895815 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z7tc\" (UniqueName: \"kubernetes.io/projected/e23514ea-6a1f-433d-ab93-663bd65629d2-kube-api-access-9z7tc\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.895859 4694 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.895869 4694 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e23514ea-6a1f-433d-ab93-663bd65629d2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.895878 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.895887 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.895900 4694 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:45 crc kubenswrapper[4694]: W0217 17:03:45.923511 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1843aa1a_460a_42ec_adb2_b20b48c71a90.slice/crio-34bd4220fe99fa20094e423dc26730bacc7fb251b9042bdaf1de44bce6d4cfc6 WatchSource:0}: Error finding container 
34bd4220fe99fa20094e423dc26730bacc7fb251b9042bdaf1de44bce6d4cfc6: Status 404 returned error can't find the container with id 34bd4220fe99fa20094e423dc26730bacc7fb251b9042bdaf1de44bce6d4cfc6 Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.927562 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cff554946-ddg9w"] Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.948655 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-config-data" (OuterVolumeSpecName: "config-data") pod "e23514ea-6a1f-433d-ab93-663bd65629d2" (UID: "e23514ea-6a1f-433d-ab93-663bd65629d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:45 crc kubenswrapper[4694]: I0217 17:03:45.997940 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23514ea-6a1f-433d-ab93-663bd65629d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.427700 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e23514ea-6a1f-433d-ab93-663bd65629d2","Type":"ContainerDied","Data":"4edb5aada38db9867c0022bfd4e86c7714541273160468e4ae13cc3ca67b880f"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.427744 4694 scope.go:117] "RemoveContainer" containerID="73aea5c7d9bbb6b4eadd09cce29c9df796c542bdb2c21f07a89e1057f0fd9b2b" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.427847 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.432697 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b65b64c9-dlmdm" event={"ID":"0470d53c-a76c-4cf3-8f95-1ae293182645","Type":"ContainerStarted","Data":"8fe2ab27574ef4ee3aea5e01ec2e9f6727fdeaa23da2b6e9d17b77b2fefe3270"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.432856 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b65b64c9-dlmdm" event={"ID":"0470d53c-a76c-4cf3-8f95-1ae293182645","Type":"ContainerStarted","Data":"b9297f0b2fae33373b67a74a6d939a356ad54697e4ceae7a518cf81c076924c6"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.436132 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" event={"ID":"d79bfe1a-e161-41b0-8eed-0f1879b1f990","Type":"ContainerStarted","Data":"79cd1fa54aebbd2133baedf6d721bdc5dbf1883350bc8d8603ad5812181e29fc"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.436265 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" event={"ID":"d79bfe1a-e161-41b0-8eed-0f1879b1f990","Type":"ContainerStarted","Data":"2d729a469527b899d94227f4b46c25c4fc75b99b5ec79313219c909829c514a8"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.439036 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cff554946-ddg9w" event={"ID":"1843aa1a-460a-42ec-adb2-b20b48c71a90","Type":"ContainerStarted","Data":"0fa2afa576c79cb9b8b40878e9ea7fec8523357ac0248cec47bfedafd9148314"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.439144 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cff554946-ddg9w" event={"ID":"1843aa1a-460a-42ec-adb2-b20b48c71a90","Type":"ContainerStarted","Data":"34bd4220fe99fa20094e423dc26730bacc7fb251b9042bdaf1de44bce6d4cfc6"} Feb 17 17:03:46 crc 
kubenswrapper[4694]: I0217 17:03:46.448357 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" event={"ID":"1313c9d0-5542-4147-a6ef-f66b09b571b4","Type":"ContainerStarted","Data":"82c26206b84b05fd48d1d0d10a01d3c57348440596310d2768be3211a70c232b"} Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.448501 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.473790 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b65b64c9-dlmdm" podStartSLOduration=2.5817761150000003 podStartE2EDuration="5.473769272s" podCreationTimestamp="2026-02-17 17:03:41 +0000 UTC" firstStartedPulling="2026-02-17 17:03:42.445961766 +0000 UTC m=+1290.203037080" lastFinishedPulling="2026-02-17 17:03:45.337954923 +0000 UTC m=+1293.095030237" observedRunningTime="2026-02-17 17:03:46.461290305 +0000 UTC m=+1294.218365639" watchObservedRunningTime="2026-02-17 17:03:46.473769272 +0000 UTC m=+1294.230844606" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.496551 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-694d8d5c8-nq2bp" podStartSLOduration=2.8029038870000003 podStartE2EDuration="5.496516292s" podCreationTimestamp="2026-02-17 17:03:41 +0000 UTC" firstStartedPulling="2026-02-17 17:03:42.64522855 +0000 UTC m=+1290.402303884" lastFinishedPulling="2026-02-17 17:03:45.338840965 +0000 UTC m=+1293.095916289" observedRunningTime="2026-02-17 17:03:46.491276453 +0000 UTC m=+1294.248351777" watchObservedRunningTime="2026-02-17 17:03:46.496516292 +0000 UTC m=+1294.253591616" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.546815 4694 scope.go:117] "RemoveContainer" containerID="73ac86b877959f435d4577b40edb10d3adca62fa8d949dc3bf0cfefb61dfd708" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.559685 
4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.567484 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.573092 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" podStartSLOduration=5.573051135 podStartE2EDuration="5.573051135s" podCreationTimestamp="2026-02-17 17:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:46.546707217 +0000 UTC m=+1294.303782551" watchObservedRunningTime="2026-02-17 17:03:46.573051135 +0000 UTC m=+1294.330126459" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.604697 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:03:46 crc kubenswrapper[4694]: E0217 17:03:46.605133 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="proxy-httpd" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.605145 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="proxy-httpd" Feb 17 17:03:46 crc kubenswrapper[4694]: E0217 17:03:46.605180 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="ceilometer-notification-agent" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.605186 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="ceilometer-notification-agent" Feb 17 17:03:46 crc kubenswrapper[4694]: E0217 17:03:46.605194 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="sg-core" Feb 17 17:03:46 crc kubenswrapper[4694]: 
I0217 17:03:46.605201 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="sg-core" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.605361 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="proxy-httpd" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.605373 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="ceilometer-notification-agent" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.605394 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" containerName="sg-core" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.607009 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.608193 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.610753 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.611054 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.645675 4694 scope.go:117] "RemoveContainer" containerID="b6a670dde438e732d228b307c640eff615c4816e3202510a5f5890905440fa27" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.716515 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-log-httpd\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 
17:03:46.716594 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.716657 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-run-httpd\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.716872 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz77m\" (UniqueName: \"kubernetes.io/projected/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-kube-api-access-bz77m\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.717011 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.717089 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.717208 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-scripts\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.818894 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz77m\" (UniqueName: \"kubernetes.io/projected/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-kube-api-access-bz77m\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.818964 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.818993 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.819022 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-scripts\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.819047 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-log-httpd\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc 
kubenswrapper[4694]: I0217 17:03:46.819070 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.819086 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-run-httpd\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.819629 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-run-httpd\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.820845 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-log-httpd\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.826480 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.826899 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data\") pod \"ceilometer-0\" (UID: 
\"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.827192 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-scripts\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.840319 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.860466 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz77m\" (UniqueName: \"kubernetes.io/projected/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-kube-api-access-bz77m\") pod \"ceilometer-0\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " pod="openstack/ceilometer-0" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.920579 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23514ea-6a1f-433d-ab93-663bd65629d2" path="/var/lib/kubelet/pods/e23514ea-6a1f-433d-ab93-663bd65629d2/volumes" Feb 17 17:03:46 crc kubenswrapper[4694]: I0217 17:03:46.940049 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:03:47 crc kubenswrapper[4694]: I0217 17:03:47.420964 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:03:47 crc kubenswrapper[4694]: W0217 17:03:47.429292 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eacc3da_f0eb_4c87_a981_f1cc3f180e37.slice/crio-b57b4af1eb7b3d88bdbedf8a70780109c2f25a630a3996819fa085d4b987f36c WatchSource:0}: Error finding container b57b4af1eb7b3d88bdbedf8a70780109c2f25a630a3996819fa085d4b987f36c: Status 404 returned error can't find the container with id b57b4af1eb7b3d88bdbedf8a70780109c2f25a630a3996819fa085d4b987f36c Feb 17 17:03:47 crc kubenswrapper[4694]: I0217 17:03:47.461134 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerStarted","Data":"b57b4af1eb7b3d88bdbedf8a70780109c2f25a630a3996819fa085d4b987f36c"} Feb 17 17:03:47 crc kubenswrapper[4694]: I0217 17:03:47.463254 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cff554946-ddg9w" event={"ID":"1843aa1a-460a-42ec-adb2-b20b48c71a90","Type":"ContainerStarted","Data":"ed5181f80bb54701170d2c6694eea4fc3360bfc6bfce2c7c42ba49fccfaf123e"} Feb 17 17:03:47 crc kubenswrapper[4694]: I0217 17:03:47.489601 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cff554946-ddg9w" podStartSLOduration=3.489580569 podStartE2EDuration="3.489580569s" podCreationTimestamp="2026-02-17 17:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:47.487060547 +0000 UTC m=+1295.244135871" watchObservedRunningTime="2026-02-17 17:03:47.489580569 +0000 UTC m=+1295.246655913" Feb 17 17:03:48 crc kubenswrapper[4694]: I0217 17:03:48.473484 4694 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerStarted","Data":"cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649"} Feb 17 17:03:48 crc kubenswrapper[4694]: I0217 17:03:48.475551 4694 generic.go:334] "Generic (PLEG): container finished" podID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" containerID="17852fe2d3511ee9cf884df541308d754b56783c29e45e3395cd35609c84b6b3" exitCode=0 Feb 17 17:03:48 crc kubenswrapper[4694]: I0217 17:03:48.475678 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x8w9s" event={"ID":"a3ff074b-45bc-4b82-89cf-b42f4b5991e1","Type":"ContainerDied","Data":"17852fe2d3511ee9cf884df541308d754b56783c29e45e3395cd35609c84b6b3"} Feb 17 17:03:48 crc kubenswrapper[4694]: I0217 17:03:48.475881 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:48 crc kubenswrapper[4694]: I0217 17:03:48.475921 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.488813 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerStarted","Data":"bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd"} Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.896349 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.999399 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-combined-ca-bundle\") pod \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.999444 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-scripts\") pod \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.999497 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-config-data\") pod \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.999597 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-etc-machine-id\") pod \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.999666 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8vqv\" (UniqueName: \"kubernetes.io/projected/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-kube-api-access-z8vqv\") pod \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " Feb 17 17:03:49 crc kubenswrapper[4694]: I0217 17:03:49.999714 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-db-sync-config-data\") pod \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\" (UID: \"a3ff074b-45bc-4b82-89cf-b42f4b5991e1\") " Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.000970 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a3ff074b-45bc-4b82-89cf-b42f4b5991e1" (UID: "a3ff074b-45bc-4b82-89cf-b42f4b5991e1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.006512 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-kube-api-access-z8vqv" (OuterVolumeSpecName: "kube-api-access-z8vqv") pod "a3ff074b-45bc-4b82-89cf-b42f4b5991e1" (UID: "a3ff074b-45bc-4b82-89cf-b42f4b5991e1"). InnerVolumeSpecName "kube-api-access-z8vqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.006883 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-scripts" (OuterVolumeSpecName: "scripts") pod "a3ff074b-45bc-4b82-89cf-b42f4b5991e1" (UID: "a3ff074b-45bc-4b82-89cf-b42f4b5991e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.007632 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a3ff074b-45bc-4b82-89cf-b42f4b5991e1" (UID: "a3ff074b-45bc-4b82-89cf-b42f4b5991e1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.026836 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3ff074b-45bc-4b82-89cf-b42f4b5991e1" (UID: "a3ff074b-45bc-4b82-89cf-b42f4b5991e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.049931 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-config-data" (OuterVolumeSpecName: "config-data") pod "a3ff074b-45bc-4b82-89cf-b42f4b5991e1" (UID: "a3ff074b-45bc-4b82-89cf-b42f4b5991e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.101536 4694 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.101598 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8vqv\" (UniqueName: \"kubernetes.io/projected/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-kube-api-access-z8vqv\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.101667 4694 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.101679 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.101690 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.101701 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3ff074b-45bc-4b82-89cf-b42f4b5991e1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.501305 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerStarted","Data":"8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480"} Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.503294 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x8w9s" event={"ID":"a3ff074b-45bc-4b82-89cf-b42f4b5991e1","Type":"ContainerDied","Data":"d05406835dcf7146affc6efa8ef6832352c63432823b06e6f2ad5075d99d3254"} Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.503321 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d05406835dcf7146affc6efa8ef6832352c63432823b06e6f2ad5075d99d3254" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.503378 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x8w9s" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.813074 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 17:03:50 crc kubenswrapper[4694]: E0217 17:03:50.813853 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" containerName="cinder-db-sync" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.813876 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" containerName="cinder-db-sync" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.814075 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" containerName="cinder-db-sync" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.816412 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.829793 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bmt67" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.829998 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.830204 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.830837 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.866434 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.916555 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.916736 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-scripts\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.916926 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55nsz\" (UniqueName: \"kubernetes.io/projected/0902cbac-17af-4207-ae3e-3f5fd8544c17-kube-api-access-55nsz\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.916953 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0902cbac-17af-4207-ae3e-3f5fd8544c17-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.917051 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.917083 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.984680 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2f59b"] Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.984980 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerName="dnsmasq-dns" containerID="cri-o://82c26206b84b05fd48d1d0d10a01d3c57348440596310d2768be3211a70c232b" gracePeriod=10 Feb 17 17:03:50 crc kubenswrapper[4694]: I0217 17:03:50.994175 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.037672 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55nsz\" (UniqueName: \"kubernetes.io/projected/0902cbac-17af-4207-ae3e-3f5fd8544c17-kube-api-access-55nsz\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.037709 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0902cbac-17af-4207-ae3e-3f5fd8544c17-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.037847 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " 
pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.037886 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.037940 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.037958 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-scripts\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.039819 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0902cbac-17af-4207-ae3e-3f5fd8544c17-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.045620 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.052298 4694 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6bb4fc677f-dxv9h"] Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.053948 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.057960 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-scripts\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.058548 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.062005 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.080492 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-dxv9h"] Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.082211 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55nsz\" (UniqueName: \"kubernetes.io/projected/0902cbac-17af-4207-ae3e-3f5fd8544c17-kube-api-access-55nsz\") pod \"cinder-scheduler-0\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") " pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.172793 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.172861 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.178238 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.182518 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.214459 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265269 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca3387f7-44d5-470d-92d5-e564cd41f9d6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265339 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-config\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265392 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbd4\" (UniqueName: \"kubernetes.io/projected/3142f4c0-5094-48ef-9151-77ee80fd3b41-kube-api-access-xkbd4\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265432 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnvct\" (UniqueName: \"kubernetes.io/projected/ca3387f7-44d5-470d-92d5-e564cd41f9d6-kube-api-access-nnvct\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265456 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265491 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3387f7-44d5-470d-92d5-e564cd41f9d6-logs\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265529 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-scripts\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265572 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265590 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265645 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265670 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265697 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.265719 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.368755 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnvct\" (UniqueName: 
\"kubernetes.io/projected/ca3387f7-44d5-470d-92d5-e564cd41f9d6-kube-api-access-nnvct\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369140 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369189 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3387f7-44d5-470d-92d5-e564cd41f9d6-logs\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369235 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-scripts\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369287 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369308 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 
17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369334 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369360 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369387 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369410 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369455 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca3387f7-44d5-470d-92d5-e564cd41f9d6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369480 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-config\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.369520 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbd4\" (UniqueName: \"kubernetes.io/projected/3142f4c0-5094-48ef-9151-77ee80fd3b41-kube-api-access-xkbd4\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.370840 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.371698 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.372214 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.372469 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ca3387f7-44d5-470d-92d5-e564cd41f9d6-logs\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.372523 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca3387f7-44d5-470d-92d5-e564cd41f9d6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.372535 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.373066 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-config\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.376503 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-scripts\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.379154 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 
17:03:51.380168 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data-custom\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.382628 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.399135 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnvct\" (UniqueName: \"kubernetes.io/projected/ca3387f7-44d5-470d-92d5-e564cd41f9d6-kube-api-access-nnvct\") pod \"cinder-api-0\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.399378 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbd4\" (UniqueName: \"kubernetes.io/projected/3142f4c0-5094-48ef-9151-77ee80fd3b41-kube-api-access-xkbd4\") pod \"dnsmasq-dns-6bb4fc677f-dxv9h\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.518319 4694 generic.go:334] "Generic (PLEG): container finished" podID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerID="82c26206b84b05fd48d1d0d10a01d3c57348440596310d2768be3211a70c232b" exitCode=0 Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.518383 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" event={"ID":"1313c9d0-5542-4147-a6ef-f66b09b571b4","Type":"ContainerDied","Data":"82c26206b84b05fd48d1d0d10a01d3c57348440596310d2768be3211a70c232b"} Feb 17 17:03:51 crc 
kubenswrapper[4694]: I0217 17:03:51.520285 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerStarted","Data":"649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0"} Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.521810 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.559284 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.560634 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.107994676 podStartE2EDuration="5.560597328s" podCreationTimestamp="2026-02-17 17:03:46 +0000 UTC" firstStartedPulling="2026-02-17 17:03:47.434179216 +0000 UTC m=+1295.191254540" lastFinishedPulling="2026-02-17 17:03:50.886781868 +0000 UTC m=+1298.643857192" observedRunningTime="2026-02-17 17:03:51.556898177 +0000 UTC m=+1299.313973501" watchObservedRunningTime="2026-02-17 17:03:51.560597328 +0000 UTC m=+1299.317672652" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.571742 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.584048 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.679023 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.785376 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-swift-storage-0\") pod \"1313c9d0-5542-4147-a6ef-f66b09b571b4\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.785498 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-sb\") pod \"1313c9d0-5542-4147-a6ef-f66b09b571b4\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.785547 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngszq\" (UniqueName: \"kubernetes.io/projected/1313c9d0-5542-4147-a6ef-f66b09b571b4-kube-api-access-ngszq\") pod \"1313c9d0-5542-4147-a6ef-f66b09b571b4\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.785575 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-config\") pod \"1313c9d0-5542-4147-a6ef-f66b09b571b4\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.785655 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-nb\") pod \"1313c9d0-5542-4147-a6ef-f66b09b571b4\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.785788 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-svc\") pod \"1313c9d0-5542-4147-a6ef-f66b09b571b4\" (UID: \"1313c9d0-5542-4147-a6ef-f66b09b571b4\") " Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.812660 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.813059 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1313c9d0-5542-4147-a6ef-f66b09b571b4-kube-api-access-ngszq" (OuterVolumeSpecName: "kube-api-access-ngszq") pod "1313c9d0-5542-4147-a6ef-f66b09b571b4" (UID: "1313c9d0-5542-4147-a6ef-f66b09b571b4"). InnerVolumeSpecName "kube-api-access-ngszq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.862208 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1313c9d0-5542-4147-a6ef-f66b09b571b4" (UID: "1313c9d0-5542-4147-a6ef-f66b09b571b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.866242 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1313c9d0-5542-4147-a6ef-f66b09b571b4" (UID: "1313c9d0-5542-4147-a6ef-f66b09b571b4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.878788 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-config" (OuterVolumeSpecName: "config") pod "1313c9d0-5542-4147-a6ef-f66b09b571b4" (UID: "1313c9d0-5542-4147-a6ef-f66b09b571b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.888846 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.888876 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngszq\" (UniqueName: \"kubernetes.io/projected/1313c9d0-5542-4147-a6ef-f66b09b571b4-kube-api-access-ngszq\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.888888 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.888898 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.889382 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1313c9d0-5542-4147-a6ef-f66b09b571b4" (UID: "1313c9d0-5542-4147-a6ef-f66b09b571b4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.889649 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1313c9d0-5542-4147-a6ef-f66b09b571b4" (UID: "1313c9d0-5542-4147-a6ef-f66b09b571b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.990555 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.990585 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1313c9d0-5542-4147-a6ef-f66b09b571b4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:51 crc kubenswrapper[4694]: I0217 17:03:51.992313 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.052308 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-dxv9h"] Feb 17 17:03:52 crc kubenswrapper[4694]: W0217 17:03:52.063678 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3142f4c0_5094_48ef_9151_77ee80fd3b41.slice/crio-136848e0df83da5dcfe890ac32dd1e84f19c680f6f77f8749cda37d035b3f055 WatchSource:0}: Error finding container 136848e0df83da5dcfe890ac32dd1e84f19c680f6f77f8749cda37d035b3f055: Status 404 returned error can't find the container with id 136848e0df83da5dcfe890ac32dd1e84f19c680f6f77f8749cda37d035b3f055 Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.170271 4694 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.538122 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" event={"ID":"1313c9d0-5542-4147-a6ef-f66b09b571b4","Type":"ContainerDied","Data":"1d9d2ba691c9eb4b5f432bef99a01d03b99fc48d70279d229c1d057c3073d2c7"} Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.538181 4694 scope.go:117] "RemoveContainer" containerID="82c26206b84b05fd48d1d0d10a01d3c57348440596310d2768be3211a70c232b" Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.538183 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-2f59b" Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.541269 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0902cbac-17af-4207-ae3e-3f5fd8544c17","Type":"ContainerStarted","Data":"045a05b7717f4106de397cc04d2ac88f5bcd01baa0a6b1a741959f84f42f16ad"} Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.543489 4694 generic.go:334] "Generic (PLEG): container finished" podID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerID="527d7558fd44b5892dc84e505a5708954dcb13334b28b652122a63ed1d382550" exitCode=0 Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.543549 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" event={"ID":"3142f4c0-5094-48ef-9151-77ee80fd3b41","Type":"ContainerDied","Data":"527d7558fd44b5892dc84e505a5708954dcb13334b28b652122a63ed1d382550"} Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.547925 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" event={"ID":"3142f4c0-5094-48ef-9151-77ee80fd3b41","Type":"ContainerStarted","Data":"136848e0df83da5dcfe890ac32dd1e84f19c680f6f77f8749cda37d035b3f055"} Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.553409 4694 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca3387f7-44d5-470d-92d5-e564cd41f9d6","Type":"ContainerStarted","Data":"9f75bb0f1ed5736ad36e0cae6a90347be581effa2340dbedff614eb9a5e045c3"} Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.598668 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2f59b"] Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.617727 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-2f59b"] Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.620541 4694 scope.go:117] "RemoveContainer" containerID="0d7173364ec4fcc3cc7b3d23ad8763e5ecce8d80f47f57a89717bc9f4c6ec3d2" Feb 17 17:03:52 crc kubenswrapper[4694]: I0217 17:03:52.930227 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" path="/var/lib/kubelet/pods/1313c9d0-5542-4147-a6ef-f66b09b571b4/volumes" Feb 17 17:03:53 crc kubenswrapper[4694]: I0217 17:03:53.197496 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:53 crc kubenswrapper[4694]: I0217 17:03:53.578687 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" event={"ID":"3142f4c0-5094-48ef-9151-77ee80fd3b41","Type":"ContainerStarted","Data":"6de2a7f3ec8ad114005d35a883e61ecf57e2dd74a6662179e7476023a4529232"} Feb 17 17:03:53 crc kubenswrapper[4694]: I0217 17:03:53.581775 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:03:53 crc kubenswrapper[4694]: I0217 17:03:53.596686 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca3387f7-44d5-470d-92d5-e564cd41f9d6","Type":"ContainerStarted","Data":"696ded4ec2b73d3c863c662332559eda9e74027d2560fdc3ca72953c7caf71f6"} Feb 17 17:03:53 crc kubenswrapper[4694]: I0217 17:03:53.630084 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" podStartSLOduration=3.6300627519999997 podStartE2EDuration="3.630062752s" podCreationTimestamp="2026-02-17 17:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:53.607882735 +0000 UTC m=+1301.364958059" watchObservedRunningTime="2026-02-17 17:03:53.630062752 +0000 UTC m=+1301.387138076" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.019372 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.052926 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.124296 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b8f4f9856-rcwl9" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.216908 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-757dbcd46d-pw2kl"] Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.217136 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon-log" containerID="cri-o://b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57" gracePeriod=30 Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.217491 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" containerID="cri-o://3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462" gracePeriod=30 Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.240987 4694 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.333973 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.448536 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6648f68957-f2dks"] Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.448861 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6648f68957-f2dks" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-api" containerID="cri-o://3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf" gracePeriod=30 Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.448960 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6648f68957-f2dks" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-httpd" containerID="cri-o://970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04" gracePeriod=30 Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.493434 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-699759854f-bj949"] Feb 17 17:03:54 crc kubenswrapper[4694]: E0217 17:03:54.493868 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerName="init" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.493880 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerName="init" Feb 17 17:03:54 crc kubenswrapper[4694]: E0217 17:03:54.493896 4694 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerName="dnsmasq-dns" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.493902 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerName="dnsmasq-dns" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.494053 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1313c9d0-5542-4147-a6ef-f66b09b571b4" containerName="dnsmasq-dns" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.494962 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.501450 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6648f68957-f2dks" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": EOF" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.524330 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-699759854f-bj949"] Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563367 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-httpd-config\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563455 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-config\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563517 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfjs\" (UniqueName: \"kubernetes.io/projected/c626c95e-85d3-4ba2-8453-060b57d2ca05-kube-api-access-kjfjs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563582 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-ovndb-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563649 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-internal-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563676 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-public-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.563722 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-combined-ca-bundle\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.630436 4694 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca3387f7-44d5-470d-92d5-e564cd41f9d6","Type":"ContainerStarted","Data":"a0c6ab4d2163b02249d18319469ac056dfd71bf1966099eaf1ed5b66e2061966"} Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.630650 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api-log" containerID="cri-o://696ded4ec2b73d3c863c662332559eda9e74027d2560fdc3ca72953c7caf71f6" gracePeriod=30 Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.630884 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.631108 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api" containerID="cri-o://a0c6ab4d2163b02249d18319469ac056dfd71bf1966099eaf1ed5b66e2061966" gracePeriod=30 Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.638132 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0902cbac-17af-4207-ae3e-3f5fd8544c17","Type":"ContainerStarted","Data":"f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a"} Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.668748 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.668725881 podStartE2EDuration="3.668725881s" podCreationTimestamp="2026-02-17 17:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:54.657635968 +0000 UTC m=+1302.414711292" watchObservedRunningTime="2026-02-17 17:03:54.668725881 +0000 UTC m=+1302.425801205" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 
17:03:54.670164 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-config\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.670336 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfjs\" (UniqueName: \"kubernetes.io/projected/c626c95e-85d3-4ba2-8453-060b57d2ca05-kube-api-access-kjfjs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.670457 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-ovndb-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.670515 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-internal-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.670537 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-public-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.670657 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-combined-ca-bundle\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.670798 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-httpd-config\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.686486 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-internal-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.686812 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-config\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.688588 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-public-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.688795 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-httpd-config\") pod 
\"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.689720 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-combined-ca-bundle\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.693278 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c626c95e-85d3-4ba2-8453-060b57d2ca05-ovndb-tls-certs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.709418 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfjs\" (UniqueName: \"kubernetes.io/projected/c626c95e-85d3-4ba2-8453-060b57d2ca05-kube-api-access-kjfjs\") pod \"neutron-699759854f-bj949\" (UID: \"c626c95e-85d3-4ba2-8453-060b57d2ca05\") " pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:54 crc kubenswrapper[4694]: I0217 17:03:54.845091 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.717745 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0902cbac-17af-4207-ae3e-3f5fd8544c17","Type":"ContainerStarted","Data":"a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248"} Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.743857 4694 generic.go:334] "Generic (PLEG): container finished" podID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerID="a0c6ab4d2163b02249d18319469ac056dfd71bf1966099eaf1ed5b66e2061966" exitCode=0 Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.744214 4694 generic.go:334] "Generic (PLEG): container finished" podID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerID="696ded4ec2b73d3c863c662332559eda9e74027d2560fdc3ca72953c7caf71f6" exitCode=143 Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.744297 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca3387f7-44d5-470d-92d5-e564cd41f9d6","Type":"ContainerDied","Data":"a0c6ab4d2163b02249d18319469ac056dfd71bf1966099eaf1ed5b66e2061966"} Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.744324 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca3387f7-44d5-470d-92d5-e564cd41f9d6","Type":"ContainerDied","Data":"696ded4ec2b73d3c863c662332559eda9e74027d2560fdc3ca72953c7caf71f6"} Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.746035 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.661501014 podStartE2EDuration="5.746015182s" podCreationTimestamp="2026-02-17 17:03:50 +0000 UTC" firstStartedPulling="2026-02-17 17:03:51.819262591 +0000 UTC m=+1299.576337905" lastFinishedPulling="2026-02-17 17:03:52.903776749 +0000 UTC m=+1300.660852073" observedRunningTime="2026-02-17 17:03:55.741713096 +0000 UTC 
m=+1303.498788410" watchObservedRunningTime="2026-02-17 17:03:55.746015182 +0000 UTC m=+1303.503090506" Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.759818 4694 generic.go:334] "Generic (PLEG): container finished" podID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerID="970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04" exitCode=0 Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.760717 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6648f68957-f2dks" event={"ID":"d7bb7a42-01ac-46d8-bb50-8765a4ffd817","Type":"ContainerDied","Data":"970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04"} Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.813377 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-699759854f-bj949"] Feb 17 17:03:55 crc kubenswrapper[4694]: W0217 17:03:55.838500 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc626c95e_85d3_4ba2_8453_060b57d2ca05.slice/crio-df1bd664666f273afa1acb9fe7f50206a15fb637975fd173f63c495d6ee2cec5 WatchSource:0}: Error finding container df1bd664666f273afa1acb9fe7f50206a15fb637975fd173f63c495d6ee2cec5: Status 404 returned error can't find the container with id df1bd664666f273afa1acb9fe7f50206a15fb637975fd173f63c495d6ee2cec5 Feb 17 17:03:55 crc kubenswrapper[4694]: I0217 17:03:55.951993 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.139696 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.140525 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3387f7-44d5-470d-92d5-e564cd41f9d6-logs\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.140596 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data-custom\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.140697 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-scripts\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.140779 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-combined-ca-bundle\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.140876 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ca3387f7-44d5-470d-92d5-e564cd41f9d6-etc-machine-id\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.140917 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnvct\" (UniqueName: \"kubernetes.io/projected/ca3387f7-44d5-470d-92d5-e564cd41f9d6-kube-api-access-nnvct\") pod \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\" (UID: \"ca3387f7-44d5-470d-92d5-e564cd41f9d6\") " Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.143713 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca3387f7-44d5-470d-92d5-e564cd41f9d6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.143994 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3387f7-44d5-470d-92d5-e564cd41f9d6-logs" (OuterVolumeSpecName: "logs") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.145859 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-scripts" (OuterVolumeSpecName: "scripts") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.150839 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.150890 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3387f7-44d5-470d-92d5-e564cd41f9d6-kube-api-access-nnvct" (OuterVolumeSpecName: "kube-api-access-nnvct") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "kube-api-access-nnvct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.175419 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.196730 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.209815 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data" (OuterVolumeSpecName: "config-data") pod "ca3387f7-44d5-470d-92d5-e564cd41f9d6" (UID: "ca3387f7-44d5-470d-92d5-e564cd41f9d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242736 4694 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242778 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242790 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242803 4694 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca3387f7-44d5-470d-92d5-e564cd41f9d6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242815 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnvct\" (UniqueName: \"kubernetes.io/projected/ca3387f7-44d5-470d-92d5-e564cd41f9d6-kube-api-access-nnvct\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242828 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3387f7-44d5-470d-92d5-e564cd41f9d6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.242839 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3387f7-44d5-470d-92d5-e564cd41f9d6-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.326211 4694 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/neutron-6648f68957-f2dks" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.771876 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699759854f-bj949" event={"ID":"c626c95e-85d3-4ba2-8453-060b57d2ca05","Type":"ContainerStarted","Data":"648e655a9386a9727eb46a1c854d00744a4e02492bfa376cfcb88063aa723381"} Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.771921 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699759854f-bj949" event={"ID":"c626c95e-85d3-4ba2-8453-060b57d2ca05","Type":"ContainerStarted","Data":"19db67828963a6c8ed53f91484b49a0e4efe50613cbbd5f909c3456e8448a71d"} Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.771933 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-699759854f-bj949" event={"ID":"c626c95e-85d3-4ba2-8453-060b57d2ca05","Type":"ContainerStarted","Data":"df1bd664666f273afa1acb9fe7f50206a15fb637975fd173f63c495d6ee2cec5"} Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.772036 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-699759854f-bj949" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.774350 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.777757 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ca3387f7-44d5-470d-92d5-e564cd41f9d6","Type":"ContainerDied","Data":"9f75bb0f1ed5736ad36e0cae6a90347be581effa2340dbedff614eb9a5e045c3"} Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.777823 4694 scope.go:117] "RemoveContainer" containerID="a0c6ab4d2163b02249d18319469ac056dfd71bf1966099eaf1ed5b66e2061966" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.806959 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-699759854f-bj949" podStartSLOduration=2.806938958 podStartE2EDuration="2.806938958s" podCreationTimestamp="2026-02-17 17:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:56.799338911 +0000 UTC m=+1304.556414235" watchObservedRunningTime="2026-02-17 17:03:56.806938958 +0000 UTC m=+1304.564014282" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.823357 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.823743 4694 scope.go:117] "RemoveContainer" containerID="696ded4ec2b73d3c863c662332559eda9e74027d2560fdc3ca72953c7caf71f6" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.830581 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.877110 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:56 crc kubenswrapper[4694]: E0217 17:03:56.877501 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api-log" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.877517 4694 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api-log" Feb 17 17:03:56 crc kubenswrapper[4694]: E0217 17:03:56.877532 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.877539 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.877728 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.877757 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" containerName="cinder-api-log" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.879102 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.888742 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.890425 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.890721 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.891104 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 17:03:56 crc kubenswrapper[4694]: I0217 17:03:56.907637 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3387f7-44d5-470d-92d5-e564cd41f9d6" path="/var/lib/kubelet/pods/ca3387f7-44d5-470d-92d5-e564cd41f9d6/volumes" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061117 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvtpz\" (UniqueName: \"kubernetes.io/projected/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-kube-api-access-cvtpz\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061552 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-logs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061681 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-etc-machine-id\") pod \"cinder-api-0\" 
(UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061768 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061825 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-config-data-custom\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061854 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-config-data\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061904 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.061939 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-scripts\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 
17:03:57.062018 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.073175 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163379 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-logs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163451 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163500 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163519 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-config-data-custom\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163537 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-config-data\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163564 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163585 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-scripts\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163647 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163674 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvtpz\" (UniqueName: \"kubernetes.io/projected/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-kube-api-access-cvtpz\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163736 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.163924 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-logs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.170844 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.170847 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.171368 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.174010 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-scripts\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.188333 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-config-data-custom\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.190882 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-config-data\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.196361 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvtpz\" (UniqueName: \"kubernetes.io/projected/e5767e0e-627e-4e5a-9ee9-c150b7bc2d72-kube-api-access-cvtpz\") pod \"cinder-api-0\" (UID: \"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72\") " pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.215520 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.263108 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cff554946-ddg9w" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.360659 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b7f697b84-ththl"] Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.361147 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b7f697b84-ththl" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api-log" containerID="cri-o://4936027ef37a6fcfbe3af74cc10dc10e7582fdbe64eee03eacd1d8fc28f9d67d" gracePeriod=30 Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.361630 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b7f697b84-ththl" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" 
containerName="barbican-api" containerID="cri-o://88dc13f979a48a93cf1803ec295d65fc40765248cb2d55852292f1a08bf10ffe" gracePeriod=30 Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.683550 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.695591 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:35652->10.217.0.150:8443: read: connection reset by peer" Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.797557 4694 generic.go:334] "Generic (PLEG): container finished" podID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerID="4936027ef37a6fcfbe3af74cc10dc10e7582fdbe64eee03eacd1d8fc28f9d67d" exitCode=143 Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.797718 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b7f697b84-ththl" event={"ID":"a450a10f-4986-4803-ac6c-0507e25ada5a","Type":"ContainerDied","Data":"4936027ef37a6fcfbe3af74cc10dc10e7582fdbe64eee03eacd1d8fc28f9d67d"} Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.806051 4694 generic.go:334] "Generic (PLEG): container finished" podID="5bc102be-9643-4310-900a-c6f6803a395a" containerID="3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462" exitCode=0 Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.806086 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757dbcd46d-pw2kl" event={"ID":"5bc102be-9643-4310-900a-c6f6803a395a","Type":"ContainerDied","Data":"3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462"} Feb 17 17:03:57 crc kubenswrapper[4694]: I0217 17:03:57.811484 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72","Type":"ContainerStarted","Data":"51c467a1611b37e406ce97015ea416848645580a6d559c79f8b87dd553d6a8c7"} Feb 17 17:03:58 crc kubenswrapper[4694]: I0217 17:03:58.850888 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72","Type":"ContainerStarted","Data":"92c16ff593e3840322509e78f44282226d2981d1e838b40342315a00e7003d68"} Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.422678 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.696301 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841199 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-combined-ca-bundle\") pod \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841665 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-httpd-config\") pod \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841702 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-config\") pod 
\"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841739 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmmcb\" (UniqueName: \"kubernetes.io/projected/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-kube-api-access-lmmcb\") pod \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841777 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-ovndb-tls-certs\") pod \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841835 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-internal-tls-certs\") pod \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.841888 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-public-tls-certs\") pod \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\" (UID: \"d7bb7a42-01ac-46d8-bb50-8765a4ffd817\") " Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.847854 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.855856 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-kube-api-access-lmmcb" (OuterVolumeSpecName: "kube-api-access-lmmcb") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "kube-api-access-lmmcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.892768 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5767e0e-627e-4e5a-9ee9-c150b7bc2d72","Type":"ContainerStarted","Data":"cb2aa78cdb6dc6a97135a19ebeddda4e8993ca47d41f65b1ae1c7a1b33b03d22"} Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.892838 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.898255 4694 generic.go:334] "Generic (PLEG): container finished" podID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerID="3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf" exitCode=0 Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.898302 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6648f68957-f2dks" event={"ID":"d7bb7a42-01ac-46d8-bb50-8765a4ffd817","Type":"ContainerDied","Data":"3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf"} Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.898322 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6648f68957-f2dks" event={"ID":"d7bb7a42-01ac-46d8-bb50-8765a4ffd817","Type":"ContainerDied","Data":"d90be3809255cce81da354e9420e3941ebce242a7ce9c97daee795dcf4519d76"} Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.898337 4694 scope.go:117] "RemoveContainer" 
containerID="970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.898442 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6648f68957-f2dks" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.915993 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.946859 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.946896 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmmcb\" (UniqueName: \"kubernetes.io/projected/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-kube-api-access-lmmcb\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.947120 4694 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.954081 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.954065162 podStartE2EDuration="3.954065162s" podCreationTimestamp="2026-02-17 17:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:03:59.941876611 +0000 UTC m=+1307.698951935" 
watchObservedRunningTime="2026-02-17 17:03:59.954065162 +0000 UTC m=+1307.711140486" Feb 17 17:03:59 crc kubenswrapper[4694]: I0217 17:03:59.982026 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.012742 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-config" (OuterVolumeSpecName: "config") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.012761 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.039121 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d7bb7a42-01ac-46d8-bb50-8765a4ffd817" (UID: "d7bb7a42-01ac-46d8-bb50-8765a4ffd817"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.048934 4694 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.050285 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.050297 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.050306 4694 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bb7a42-01ac-46d8-bb50-8765a4ffd817-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.138102 4694 scope.go:117] "RemoveContainer" containerID="3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.171763 4694 scope.go:117] "RemoveContainer" containerID="970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04" Feb 17 17:04:00 crc kubenswrapper[4694]: E0217 17:04:00.172258 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04\": container with ID starting with 970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04 not found: ID does not exist" containerID="970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 
17:04:00.172300 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04"} err="failed to get container status \"970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04\": rpc error: code = NotFound desc = could not find container \"970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04\": container with ID starting with 970a634e0e6982d88f97ea6263e3d4cbf0a4cfa997a66c36bd3f0774fb1b2e04 not found: ID does not exist" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.172323 4694 scope.go:117] "RemoveContainer" containerID="3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf" Feb 17 17:04:00 crc kubenswrapper[4694]: E0217 17:04:00.172564 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf\": container with ID starting with 3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf not found: ID does not exist" containerID="3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.172624 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf"} err="failed to get container status \"3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf\": rpc error: code = NotFound desc = could not find container \"3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf\": container with ID starting with 3200a713244917f705fcda3276c22d72e089cf748cd657f2d79a6b3e6229dcaf not found: ID does not exist" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.238590 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6648f68957-f2dks"] Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 
17:04:00.253549 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6648f68957-f2dks"] Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.477891 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.544454 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.784651 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f7569cc6b-bv5js"] Feb 17 17:04:00 crc kubenswrapper[4694]: E0217 17:04:00.785433 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-api" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.785457 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-api" Feb 17 17:04:00 crc kubenswrapper[4694]: E0217 17:04:00.785481 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-httpd" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.785489 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-httpd" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.785733 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-httpd" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.785777 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" containerName="neutron-api" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.788637 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.798583 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7569cc6b-bv5js"] Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.867979 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-internal-tls-certs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.868026 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggrpp\" (UniqueName: \"kubernetes.io/projected/1a8ff002-04b9-4dfa-af27-36823f7918a8-kube-api-access-ggrpp\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.868049 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-scripts\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.868072 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a8ff002-04b9-4dfa-af27-36823f7918a8-logs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.868121 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-public-tls-certs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.868169 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-combined-ca-bundle\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.868193 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-config-data\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.915195 4694 generic.go:334] "Generic (PLEG): container finished" podID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerID="88dc13f979a48a93cf1803ec295d65fc40765248cb2d55852292f1a08bf10ffe" exitCode=0 Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.915894 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bb7a42-01ac-46d8-bb50-8765a4ffd817" path="/var/lib/kubelet/pods/d7bb7a42-01ac-46d8-bb50-8765a4ffd817/volumes" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.916697 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b7f697b84-ththl" event={"ID":"a450a10f-4986-4803-ac6c-0507e25ada5a","Type":"ContainerDied","Data":"88dc13f979a48a93cf1803ec295d65fc40765248cb2d55852292f1a08bf10ffe"} Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.969221 4694 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-public-tls-certs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.969304 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-combined-ca-bundle\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.969928 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-config-data\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.970098 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-internal-tls-certs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.970173 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrpp\" (UniqueName: \"kubernetes.io/projected/1a8ff002-04b9-4dfa-af27-36823f7918a8-kube-api-access-ggrpp\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.970203 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-scripts\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.970254 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a8ff002-04b9-4dfa-af27-36823f7918a8-logs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.971253 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a8ff002-04b9-4dfa-af27-36823f7918a8-logs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.975907 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-internal-tls-certs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.975983 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-public-tls-certs\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.976961 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-config-data\") pod \"placement-7f7569cc6b-bv5js\" (UID: 
\"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.978200 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-scripts\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.978342 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8ff002-04b9-4dfa-af27-36823f7918a8-combined-ca-bundle\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:00 crc kubenswrapper[4694]: I0217 17:04:00.994889 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggrpp\" (UniqueName: \"kubernetes.io/projected/1a8ff002-04b9-4dfa-af27-36823f7918a8-kube-api-access-ggrpp\") pod \"placement-7f7569cc6b-bv5js\" (UID: \"1a8ff002-04b9-4dfa-af27-36823f7918a8\") " pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.110588 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.199841 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.274219 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data-custom\") pod \"a450a10f-4986-4803-ac6c-0507e25ada5a\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.274261 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szkz\" (UniqueName: \"kubernetes.io/projected/a450a10f-4986-4803-ac6c-0507e25ada5a-kube-api-access-8szkz\") pod \"a450a10f-4986-4803-ac6c-0507e25ada5a\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.274318 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data\") pod \"a450a10f-4986-4803-ac6c-0507e25ada5a\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.274358 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a450a10f-4986-4803-ac6c-0507e25ada5a-logs\") pod \"a450a10f-4986-4803-ac6c-0507e25ada5a\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.274430 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-combined-ca-bundle\") pod \"a450a10f-4986-4803-ac6c-0507e25ada5a\" (UID: \"a450a10f-4986-4803-ac6c-0507e25ada5a\") " Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.275883 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a450a10f-4986-4803-ac6c-0507e25ada5a-logs" (OuterVolumeSpecName: "logs") pod "a450a10f-4986-4803-ac6c-0507e25ada5a" (UID: "a450a10f-4986-4803-ac6c-0507e25ada5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.276897 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a450a10f-4986-4803-ac6c-0507e25ada5a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.281319 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a450a10f-4986-4803-ac6c-0507e25ada5a" (UID: "a450a10f-4986-4803-ac6c-0507e25ada5a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.295852 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a450a10f-4986-4803-ac6c-0507e25ada5a-kube-api-access-8szkz" (OuterVolumeSpecName: "kube-api-access-8szkz") pod "a450a10f-4986-4803-ac6c-0507e25ada5a" (UID: "a450a10f-4986-4803-ac6c-0507e25ada5a"). InnerVolumeSpecName "kube-api-access-8szkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.380399 4694 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.380450 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8szkz\" (UniqueName: \"kubernetes.io/projected/a450a10f-4986-4803-ac6c-0507e25ada5a-kube-api-access-8szkz\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.398164 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a450a10f-4986-4803-ac6c-0507e25ada5a" (UID: "a450a10f-4986-4803-ac6c-0507e25ada5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.406172 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data" (OuterVolumeSpecName: "config-data") pod "a450a10f-4986-4803-ac6c-0507e25ada5a" (UID: "a450a10f-4986-4803-ac6c-0507e25ada5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.482767 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.483121 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a450a10f-4986-4803-ac6c-0507e25ada5a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.507234 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.573793 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.598398 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.657215 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-s9jhl"] Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.657507 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerName="dnsmasq-dns" containerID="cri-o://4756bc878e03dc78aa60383f13e47ebd8d18069642910440443aa32fb9c22a94" gracePeriod=10 Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.688807 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7569cc6b-bv5js"] Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.931209 4694 generic.go:334] "Generic (PLEG): container finished" podID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" 
containerID="4756bc878e03dc78aa60383f13e47ebd8d18069642910440443aa32fb9c22a94" exitCode=0 Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.931272 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" event={"ID":"4962ca9a-0d86-4074-b50e-14ded17f8c4d","Type":"ContainerDied","Data":"4756bc878e03dc78aa60383f13e47ebd8d18069642910440443aa32fb9c22a94"} Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.932907 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b7f697b84-ththl" event={"ID":"a450a10f-4986-4803-ac6c-0507e25ada5a","Type":"ContainerDied","Data":"71089725c118b5da327ff0c1983521ba9db14af5156939c1c384483a3eb79b6b"} Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.932959 4694 scope.go:117] "RemoveContainer" containerID="88dc13f979a48a93cf1803ec295d65fc40765248cb2d55852292f1a08bf10ffe" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.933103 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b7f697b84-ththl" Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.939503 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="cinder-scheduler" containerID="cri-o://f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a" gracePeriod=30 Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.939931 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7569cc6b-bv5js" event={"ID":"1a8ff002-04b9-4dfa-af27-36823f7918a8","Type":"ContainerStarted","Data":"a004ae89b124f69dc823909df1c54168c2da12b2819603d4a2a6210a96fd6bf9"} Feb 17 17:04:01 crc kubenswrapper[4694]: I0217 17:04:01.940271 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="probe" containerID="cri-o://a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248" gracePeriod=30 Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.005041 4694 scope.go:117] "RemoveContainer" containerID="4936027ef37a6fcfbe3af74cc10dc10e7582fdbe64eee03eacd1d8fc28f9d67d" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.026927 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b7f697b84-ththl"] Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.035031 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-b7f697b84-ththl"] Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.296219 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.411751 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-config\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.411804 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfq4n\" (UniqueName: \"kubernetes.io/projected/4962ca9a-0d86-4074-b50e-14ded17f8c4d-kube-api-access-kfq4n\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.411945 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-svc\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.412064 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.412120 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-nb\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.412143 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-sb\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.416967 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4962ca9a-0d86-4074-b50e-14ded17f8c4d-kube-api-access-kfq4n" (OuterVolumeSpecName: "kube-api-access-kfq4n") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "kube-api-access-kfq4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.483979 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.502097 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-config" (OuterVolumeSpecName: "config") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.506829 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.507156 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.512853 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513433 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0\") pod \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\" (UID: \"4962ca9a-0d86-4074-b50e-14ded17f8c4d\") " Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513831 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513845 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513854 4694 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-config\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513863 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfq4n\" (UniqueName: \"kubernetes.io/projected/4962ca9a-0d86-4074-b50e-14ded17f8c4d-kube-api-access-kfq4n\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513871 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:02 crc kubenswrapper[4694]: W0217 17:04:02.513939 4694 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4962ca9a-0d86-4074-b50e-14ded17f8c4d/volumes/kubernetes.io~configmap/dns-swift-storage-0
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.513949 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4962ca9a-0d86-4074-b50e-14ded17f8c4d" (UID: "4962ca9a-0d86-4074-b50e-14ded17f8c4d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.615322 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4962ca9a-0d86-4074-b50e-14ded17f8c4d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.905738 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" path="/var/lib/kubelet/pods/a450a10f-4986-4803-ac6c-0507e25ada5a/volumes"
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.954565 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7569cc6b-bv5js" event={"ID":"1a8ff002-04b9-4dfa-af27-36823f7918a8","Type":"ContainerStarted","Data":"9bafecd0474c72d71021e05559e5c918d35fd5438b5ac09ee4bfc916f97c07f1"}
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.954981 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7569cc6b-bv5js" event={"ID":"1a8ff002-04b9-4dfa-af27-36823f7918a8","Type":"ContainerStarted","Data":"3bb2a7c9db09b2721edb9142d9baa82b08a860cc1d86bd6abef31aa7fec25537"}
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.955001 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7569cc6b-bv5js"
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.955013 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7569cc6b-bv5js"
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.956512 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl" event={"ID":"4962ca9a-0d86-4074-b50e-14ded17f8c4d","Type":"ContainerDied","Data":"11ef69e645d971e34fce061c50c0a2321a1a56c642c2f0317129c05432af3d8e"}
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.956523 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-s9jhl"
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.956565 4694 scope.go:117] "RemoveContainer" containerID="4756bc878e03dc78aa60383f13e47ebd8d18069642910440443aa32fb9c22a94"
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.968289 4694 generic.go:334] "Generic (PLEG): container finished" podID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerID="a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248" exitCode=0
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.968374 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0902cbac-17af-4207-ae3e-3f5fd8544c17","Type":"ContainerDied","Data":"a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248"}
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.979948 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f7569cc6b-bv5js" podStartSLOduration=2.979929349 podStartE2EDuration="2.979929349s" podCreationTimestamp="2026-02-17 17:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:02.97387981 +0000 UTC m=+1310.730955144" watchObservedRunningTime="2026-02-17 17:04:02.979929349 +0000 UTC m=+1310.737004673"
Feb 17 17:04:02 crc kubenswrapper[4694]: I0217 17:04:02.989803 4694 scope.go:117] "RemoveContainer" containerID="f0092ae2c5acf0cff1d4b5ff7bb31cbc70fe22f43561100e07e72fe583075853"
Feb 17 17:04:03 crc kubenswrapper[4694]: I0217 17:04:03.008658 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-s9jhl"]
Feb 17 17:04:03 crc kubenswrapper[4694]: I0217 17:04:03.024074 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-s9jhl"]
Feb 17 17:04:04 crc kubenswrapper[4694]: I0217 17:04:04.915783 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" path="/var/lib/kubelet/pods/4962ca9a-0d86-4074-b50e-14ded17f8c4d/volumes"
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.550099 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8667649c99-28rzh"
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.755732 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895100 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0902cbac-17af-4207-ae3e-3f5fd8544c17-etc-machine-id\") pod \"0902cbac-17af-4207-ae3e-3f5fd8544c17\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") "
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895164 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-combined-ca-bundle\") pod \"0902cbac-17af-4207-ae3e-3f5fd8544c17\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") "
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895200 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55nsz\" (UniqueName: \"kubernetes.io/projected/0902cbac-17af-4207-ae3e-3f5fd8544c17-kube-api-access-55nsz\") pod \"0902cbac-17af-4207-ae3e-3f5fd8544c17\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") "
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895235 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data\") pod \"0902cbac-17af-4207-ae3e-3f5fd8544c17\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") "
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895252 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0902cbac-17af-4207-ae3e-3f5fd8544c17-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0902cbac-17af-4207-ae3e-3f5fd8544c17" (UID: "0902cbac-17af-4207-ae3e-3f5fd8544c17"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895285 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-scripts\") pod \"0902cbac-17af-4207-ae3e-3f5fd8544c17\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") "
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.895452 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data-custom\") pod \"0902cbac-17af-4207-ae3e-3f5fd8544c17\" (UID: \"0902cbac-17af-4207-ae3e-3f5fd8544c17\") "
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.897878 4694 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0902cbac-17af-4207-ae3e-3f5fd8544c17-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.901020 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-scripts" (OuterVolumeSpecName: "scripts") pod "0902cbac-17af-4207-ae3e-3f5fd8544c17" (UID: "0902cbac-17af-4207-ae3e-3f5fd8544c17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.901505 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0902cbac-17af-4207-ae3e-3f5fd8544c17" (UID: "0902cbac-17af-4207-ae3e-3f5fd8544c17"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.904051 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0902cbac-17af-4207-ae3e-3f5fd8544c17-kube-api-access-55nsz" (OuterVolumeSpecName: "kube-api-access-55nsz") pod "0902cbac-17af-4207-ae3e-3f5fd8544c17" (UID: "0902cbac-17af-4207-ae3e-3f5fd8544c17"). InnerVolumeSpecName "kube-api-access-55nsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.955023 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0902cbac-17af-4207-ae3e-3f5fd8544c17" (UID: "0902cbac-17af-4207-ae3e-3f5fd8544c17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.999907 4694 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.999942 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.999954 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55nsz\" (UniqueName: \"kubernetes.io/projected/0902cbac-17af-4207-ae3e-3f5fd8544c17-kube-api-access-55nsz\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:06 crc kubenswrapper[4694]: I0217 17:04:06.999967 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.000039 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data" (OuterVolumeSpecName: "config-data") pod "0902cbac-17af-4207-ae3e-3f5fd8544c17" (UID: "0902cbac-17af-4207-ae3e-3f5fd8544c17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.016934 4694 generic.go:334] "Generic (PLEG): container finished" podID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerID="f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a" exitCode=0
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.016984 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0902cbac-17af-4207-ae3e-3f5fd8544c17","Type":"ContainerDied","Data":"f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a"}
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.017014 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0902cbac-17af-4207-ae3e-3f5fd8544c17","Type":"ContainerDied","Data":"045a05b7717f4106de397cc04d2ac88f5bcd01baa0a6b1a741959f84f42f16ad"}
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.017037 4694 scope.go:117] "RemoveContainer" containerID="a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.017231 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.054524 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.056383 4694 scope.go:117] "RemoveContainer" containerID="f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.069912 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084219 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.084656 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="cinder-scheduler"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084676 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="cinder-scheduler"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.084689 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerName="dnsmasq-dns"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084696 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerName="dnsmasq-dns"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.084708 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084714 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.084724 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api-log"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084730 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api-log"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.084743 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerName="init"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084748 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerName="init"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.084757 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="probe"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084762 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="probe"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084929 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4962ca9a-0d86-4074-b50e-14ded17f8c4d" containerName="dnsmasq-dns"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084944 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084955 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a450a10f-4986-4803-ac6c-0507e25ada5a" containerName="barbican-api-log"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084964 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="probe"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.084977 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" containerName="cinder-scheduler"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.085519 4694 scope.go:117] "RemoveContainer" containerID="a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.085968 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.086031 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248\": container with ID starting with a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248 not found: ID does not exist" containerID="a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.086058 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248"} err="failed to get container status \"a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248\": rpc error: code = NotFound desc = could not find container \"a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248\": container with ID starting with a7dd5a27e4190c45b0b6b5c3578f46b512583b1ccc4e7bd10e4a58a6377be248 not found: ID does not exist"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.086083 4694 scope.go:117] "RemoveContainer" containerID="f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a"
Feb 17 17:04:07 crc kubenswrapper[4694]: E0217 17:04:07.086294 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a\": container with ID starting with f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a not found: ID does not exist" containerID="f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.086330 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a"} err="failed to get container status \"f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a\": rpc error: code = NotFound desc = could not find container \"f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a\": container with ID starting with f056561fac446a2f192d3791492a3bb95e6dd93c82d744a5343217d089a3d02a not found: ID does not exist"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.087866 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.101860 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0902cbac-17af-4207-ae3e-3f5fd8544c17-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.127741 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.203339 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.203438 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ccs\" (UniqueName: \"kubernetes.io/projected/83a8e274-5312-4be8-9f81-c7b13a2effe1-kube-api-access-p9ccs\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.203534 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83a8e274-5312-4be8-9f81-c7b13a2effe1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.203575 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-config-data\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.203638 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.203685 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-scripts\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.304908 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.305000 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ccs\" (UniqueName: \"kubernetes.io/projected/83a8e274-5312-4be8-9f81-c7b13a2effe1-kube-api-access-p9ccs\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.305080 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83a8e274-5312-4be8-9f81-c7b13a2effe1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.305108 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-config-data\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.305146 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.305194 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-scripts\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.307211 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83a8e274-5312-4be8-9f81-c7b13a2effe1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.309331 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-scripts\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.310309 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-config-data\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.311513 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.314979 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83a8e274-5312-4be8-9f81-c7b13a2effe1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.326633 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ccs\" (UniqueName: \"kubernetes.io/projected/83a8e274-5312-4be8-9f81-c7b13a2effe1-kube-api-access-p9ccs\") pod \"cinder-scheduler-0\" (UID: \"83a8e274-5312-4be8-9f81-c7b13a2effe1\") " pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.408374 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 17:04:07 crc kubenswrapper[4694]: I0217 17:04:07.872503 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 17:04:07 crc kubenswrapper[4694]: W0217 17:04:07.876352 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83a8e274_5312_4be8_9f81_c7b13a2effe1.slice/crio-00e2ac9c3ffd634bb1a3c2f40342618ebfddcbf6f0c07c05ff0a1da5b0b78137 WatchSource:0}: Error finding container 00e2ac9c3ffd634bb1a3c2f40342618ebfddcbf6f0c07c05ff0a1da5b0b78137: Status 404 returned error can't find the container with id 00e2ac9c3ffd634bb1a3c2f40342618ebfddcbf6f0c07c05ff0a1da5b0b78137
Feb 17 17:04:08 crc kubenswrapper[4694]: I0217 17:04:08.029650 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83a8e274-5312-4be8-9f81-c7b13a2effe1","Type":"ContainerStarted","Data":"00e2ac9c3ffd634bb1a3c2f40342618ebfddcbf6f0c07c05ff0a1da5b0b78137"}
Feb 17 17:04:08 crc kubenswrapper[4694]: I0217 17:04:08.907374 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0902cbac-17af-4207-ae3e-3f5fd8544c17" path="/var/lib/kubelet/pods/0902cbac-17af-4207-ae3e-3f5fd8544c17/volumes"
Feb 17 17:04:09 crc kubenswrapper[4694]: I0217 17:04:09.058346 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83a8e274-5312-4be8-9f81-c7b13a2effe1","Type":"ContainerStarted","Data":"2f170afd09762c936036e7dcde8e01ef216ca246789cc62d12a25cc2a735ceb6"}
Feb 17 17:04:09 crc kubenswrapper[4694]: I0217 17:04:09.058423 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"83a8e274-5312-4be8-9f81-c7b13a2effe1","Type":"ContainerStarted","Data":"47995cff16f5a574cf0111210b0e5a086b18c7b321365fef9c7918998f9737e5"}
Feb 17 17:04:09 crc kubenswrapper[4694]: I0217 17:04:09.077480 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.077458009 podStartE2EDuration="2.077458009s" podCreationTimestamp="2026-02-17 17:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:09.074178058 +0000 UTC m=+1316.831253382" watchObservedRunningTime="2026-02-17 17:04:09.077458009 +0000 UTC m=+1316.834533333"
Feb 17 17:04:09 crc kubenswrapper[4694]: I0217 17:04:09.291727 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 17 17:04:09 crc kubenswrapper[4694]: I0217 17:04:09.423037 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.284999 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.286576 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.291646 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.291897 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.293234 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7zvkr"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.301056 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.392581 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.392672 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config-secret\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.392723 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.392758 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzrc\" (UniqueName: \"kubernetes.io/projected/df703e87-37f1-4fee-aa47-e7098d2ed66f-kube-api-access-jrzrc\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.494306 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.494400 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config-secret\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.494444 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.494480 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrzrc\" (UniqueName: \"kubernetes.io/projected/df703e87-37f1-4fee-aa47-e7098d2ed66f-kube-api-access-jrzrc\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.496874 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.502371 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.503578 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config-secret\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.518422 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzrc\" (UniqueName: \"kubernetes.io/projected/df703e87-37f1-4fee-aa47-e7098d2ed66f-kube-api-access-jrzrc\") pod \"openstackclient\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.592395 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.593267 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.612673 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.639108 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.640468 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.650839 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.702807 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqsv\" (UniqueName: \"kubernetes.io/projected/30281bdf-b35e-4ac1-8cde-8a333e24f564-kube-api-access-5cqsv\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.702936 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30281bdf-b35e-4ac1-8cde-8a333e24f564-combined-ca-bundle\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.702969 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/30281bdf-b35e-4ac1-8cde-8a333e24f564-openstack-config-secret\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.703045 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/30281bdf-b35e-4ac1-8cde-8a333e24f564-openstack-config\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient"
Feb 17 17:04:11 crc kubenswrapper[4694]: E0217 17:04:11.758134 4694 log.go:32] "RunPodSandbox from runtime service failed" err=<
Feb 17 17:04:11 crc kubenswrapper[4694]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_df703e87-37f1-4fee-aa47-e7098d2ed66f_0(d1c7991496232f6b6f6a92b58585af5a06a0c0b1eda39037a7daa1ffed214f89): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1c7991496232f6b6f6a92b58585af5a06a0c0b1eda39037a7daa1ffed214f89" Netns:"/var/run/netns/517f0c1c-6af5-44a9-b7d4-f565e8547267" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=d1c7991496232f6b6f6a92b58585af5a06a0c0b1eda39037a7daa1ffed214f89;K8S_POD_UID=df703e87-37f1-4fee-aa47-e7098d2ed66f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/df703e87-37f1-4fee-aa47-e7098d2ed66f]: expected pod UID "df703e87-37f1-4fee-aa47-e7098d2ed66f" but got "30281bdf-b35e-4ac1-8cde-8a333e24f564" from Kube API
Feb 17 17:04:11 crc kubenswrapper[4694]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 17 17:04:11 crc kubenswrapper[4694]: >
Feb 17 17:04:11 crc
kubenswrapper[4694]: E0217 17:04:11.758382 4694 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 17:04:11 crc kubenswrapper[4694]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_df703e87-37f1-4fee-aa47-e7098d2ed66f_0(d1c7991496232f6b6f6a92b58585af5a06a0c0b1eda39037a7daa1ffed214f89): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d1c7991496232f6b6f6a92b58585af5a06a0c0b1eda39037a7daa1ffed214f89" Netns:"/var/run/netns/517f0c1c-6af5-44a9-b7d4-f565e8547267" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=d1c7991496232f6b6f6a92b58585af5a06a0c0b1eda39037a7daa1ffed214f89;K8S_POD_UID=df703e87-37f1-4fee-aa47-e7098d2ed66f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/df703e87-37f1-4fee-aa47-e7098d2ed66f]: expected pod UID "df703e87-37f1-4fee-aa47-e7098d2ed66f" but got "30281bdf-b35e-4ac1-8cde-8a333e24f564" from Kube API Feb 17 17:04:11 crc kubenswrapper[4694]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 17:04:11 crc kubenswrapper[4694]: > pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.806835 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqsv\" (UniqueName: \"kubernetes.io/projected/30281bdf-b35e-4ac1-8cde-8a333e24f564-kube-api-access-5cqsv\") pod \"openstackclient\" 
(UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.806962 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30281bdf-b35e-4ac1-8cde-8a333e24f564-combined-ca-bundle\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.806994 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/30281bdf-b35e-4ac1-8cde-8a333e24f564-openstack-config-secret\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.807038 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/30281bdf-b35e-4ac1-8cde-8a333e24f564-openstack-config\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.808378 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/30281bdf-b35e-4ac1-8cde-8a333e24f564-openstack-config\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.812416 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/30281bdf-b35e-4ac1-8cde-8a333e24f564-openstack-config-secret\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.816062 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30281bdf-b35e-4ac1-8cde-8a333e24f564-combined-ca-bundle\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:11 crc kubenswrapper[4694]: I0217 17:04:11.822536 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqsv\" (UniqueName: \"kubernetes.io/projected/30281bdf-b35e-4ac1-8cde-8a333e24f564-kube-api-access-5cqsv\") pod \"openstackclient\" (UID: \"30281bdf-b35e-4ac1-8cde-8a333e24f564\") " pod="openstack/openstackclient" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.083283 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.085480 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.092712 4694 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="df703e87-37f1-4fee-aa47-e7098d2ed66f" podUID="30281bdf-b35e-4ac1-8cde-8a333e24f564" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.098171 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.162129 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.162426 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-central-agent" containerID="cri-o://cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649" gracePeriod=30 Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.162543 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-notification-agent" containerID="cri-o://bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd" gracePeriod=30 Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.162527 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="proxy-httpd" containerID="cri-o://649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0" gracePeriod=30 Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.162520 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="sg-core" containerID="cri-o://8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480" gracePeriod=30 Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.167922 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.215496 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrzrc\" (UniqueName: 
\"kubernetes.io/projected/df703e87-37f1-4fee-aa47-e7098d2ed66f-kube-api-access-jrzrc\") pod \"df703e87-37f1-4fee-aa47-e7098d2ed66f\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.215922 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-combined-ca-bundle\") pod \"df703e87-37f1-4fee-aa47-e7098d2ed66f\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.215955 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config-secret\") pod \"df703e87-37f1-4fee-aa47-e7098d2ed66f\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.216049 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config\") pod \"df703e87-37f1-4fee-aa47-e7098d2ed66f\" (UID: \"df703e87-37f1-4fee-aa47-e7098d2ed66f\") " Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.223050 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df703e87-37f1-4fee-aa47-e7098d2ed66f-kube-api-access-jrzrc" (OuterVolumeSpecName: "kube-api-access-jrzrc") pod "df703e87-37f1-4fee-aa47-e7098d2ed66f" (UID: "df703e87-37f1-4fee-aa47-e7098d2ed66f"). InnerVolumeSpecName "kube-api-access-jrzrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.224673 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "df703e87-37f1-4fee-aa47-e7098d2ed66f" (UID: "df703e87-37f1-4fee-aa47-e7098d2ed66f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.231346 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "df703e87-37f1-4fee-aa47-e7098d2ed66f" (UID: "df703e87-37f1-4fee-aa47-e7098d2ed66f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.232123 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df703e87-37f1-4fee-aa47-e7098d2ed66f" (UID: "df703e87-37f1-4fee-aa47-e7098d2ed66f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.318032 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrzrc\" (UniqueName: \"kubernetes.io/projected/df703e87-37f1-4fee-aa47-e7098d2ed66f-kube-api-access-jrzrc\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.318276 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.318285 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.318308 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df703e87-37f1-4fee-aa47-e7098d2ed66f-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.408995 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.555298 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 17:04:12 crc kubenswrapper[4694]: W0217 17:04:12.563275 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30281bdf_b35e_4ac1_8cde_8a333e24f564.slice/crio-57669c8a8b69d4f895f8874239710283e2e4312f8bcde82609d4a34df141e04b WatchSource:0}: Error finding container 57669c8a8b69d4f895f8874239710283e2e4312f8bcde82609d4a34df141e04b: Status 404 returned error can't find the container with id 
57669c8a8b69d4f895f8874239710283e2e4312f8bcde82609d4a34df141e04b Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.668088 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9f54df747-vdnkk"] Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.669747 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.672577 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.672744 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.673498 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.686067 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9f54df747-vdnkk"] Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826447 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lths\" (UniqueName: \"kubernetes.io/projected/c0f37d92-d923-43f2-807f-d52cd9003a2c-kube-api-access-6lths\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826544 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-internal-tls-certs\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826586 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0f37d92-d923-43f2-807f-d52cd9003a2c-etc-swift\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826644 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f37d92-d923-43f2-807f-d52cd9003a2c-run-httpd\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826775 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-combined-ca-bundle\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826821 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f37d92-d923-43f2-807f-d52cd9003a2c-log-httpd\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826895 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-config-data\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.826919 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-public-tls-certs\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.924431 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df703e87-37f1-4fee-aa47-e7098d2ed66f" path="/var/lib/kubelet/pods/df703e87-37f1-4fee-aa47-e7098d2ed66f/volumes" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928459 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0f37d92-d923-43f2-807f-d52cd9003a2c-etc-swift\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928560 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f37d92-d923-43f2-807f-d52cd9003a2c-run-httpd\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928665 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-combined-ca-bundle\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928708 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f37d92-d923-43f2-807f-d52cd9003a2c-log-httpd\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: 
\"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928787 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-config-data\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928830 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-public-tls-certs\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928907 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lths\" (UniqueName: \"kubernetes.io/projected/c0f37d92-d923-43f2-807f-d52cd9003a2c-kube-api-access-6lths\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.928978 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-internal-tls-certs\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.929833 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f37d92-d923-43f2-807f-d52cd9003a2c-run-httpd\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " 
pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.930204 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0f37d92-d923-43f2-807f-d52cd9003a2c-log-httpd\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.932798 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.932901 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.933049 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.936679 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-combined-ca-bundle\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.943571 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-internal-tls-certs\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.943736 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-public-tls-certs\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: 
\"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.945277 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f37d92-d923-43f2-807f-d52cd9003a2c-config-data\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.945544 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c0f37d92-d923-43f2-807f-d52cd9003a2c-etc-swift\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.957990 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lths\" (UniqueName: \"kubernetes.io/projected/c0f37d92-d923-43f2-807f-d52cd9003a2c-kube-api-access-6lths\") pod \"swift-proxy-9f54df747-vdnkk\" (UID: \"c0f37d92-d923-43f2-807f-d52cd9003a2c\") " pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:12 crc kubenswrapper[4694]: I0217 17:04:12.989464 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.113743 4694 generic.go:334] "Generic (PLEG): container finished" podID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerID="649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0" exitCode=0 Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.114097 4694 generic.go:334] "Generic (PLEG): container finished" podID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerID="8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480" exitCode=2 Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.114106 4694 generic.go:334] "Generic (PLEG): container finished" podID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerID="cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649" exitCode=0 Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.113861 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerDied","Data":"649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0"} Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.114201 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerDied","Data":"8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480"} Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.114238 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerDied","Data":"cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649"} Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.116439 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"30281bdf-b35e-4ac1-8cde-8a333e24f564","Type":"ContainerStarted","Data":"57669c8a8b69d4f895f8874239710283e2e4312f8bcde82609d4a34df141e04b"} Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.116496 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.123500 4694 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="df703e87-37f1-4fee-aa47-e7098d2ed66f" podUID="30281bdf-b35e-4ac1-8cde-8a333e24f564" Feb 17 17:04:13 crc kubenswrapper[4694]: I0217 17:04:13.534473 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9f54df747-vdnkk"] Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.125488 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9f54df747-vdnkk" event={"ID":"c0f37d92-d923-43f2-807f-d52cd9003a2c","Type":"ContainerStarted","Data":"74397913032e30933fdf0562ae26c9134239f5318429dabfe26ef934d3efc182"} Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.126168 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9f54df747-vdnkk" event={"ID":"c0f37d92-d923-43f2-807f-d52cd9003a2c","Type":"ContainerStarted","Data":"5e3b55b97c07c8d90b4b5d724075e03784ca2f8e469ba160182560bfaa9f9d4b"} Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.126294 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.126370 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9f54df747-vdnkk" event={"ID":"c0f37d92-d923-43f2-807f-d52cd9003a2c","Type":"ContainerStarted","Data":"e7ea133006bd5e4eb3cd3a8bd21bdae56597ebb77ade4e263df75cbcb2733e29"} Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.126689 4694 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.150095 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9f54df747-vdnkk" podStartSLOduration=2.1500696 podStartE2EDuration="2.1500696s" podCreationTimestamp="2026-02-17 17:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:14.147051205 +0000 UTC m=+1321.904126529" watchObservedRunningTime="2026-02-17 17:04:14.1500696 +0000 UTC m=+1321.907144924" Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.618390 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:04:14 crc kubenswrapper[4694]: I0217 17:04:14.618452 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:04:16 crc kubenswrapper[4694]: I0217 17:04:16.870034 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.015803 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.015871 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz77m\" (UniqueName: \"kubernetes.io/projected/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-kube-api-access-bz77m\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.015899 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-combined-ca-bundle\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.015929 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-sg-core-conf-yaml\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.016018 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-run-httpd\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.016039 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-scripts\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.016064 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-log-httpd\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.016923 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.017153 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.035781 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-scripts" (OuterVolumeSpecName: "scripts") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.036161 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-kube-api-access-bz77m" (OuterVolumeSpecName: "kube-api-access-bz77m") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "kube-api-access-bz77m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.048808 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.099980 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.117266 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data" (OuterVolumeSpecName: "config-data") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.117921 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data\") pod \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\" (UID: \"8eacc3da-f0eb-4c87-a981-f1cc3f180e37\") " Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118382 4694 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118398 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118407 4694 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118416 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz77m\" (UniqueName: \"kubernetes.io/projected/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-kube-api-access-bz77m\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118425 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118436 4694 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 
17:04:17 crc kubenswrapper[4694]: W0217 17:04:17.118513 4694 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8eacc3da-f0eb-4c87-a981-f1cc3f180e37/volumes/kubernetes.io~secret/config-data Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.118527 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data" (OuterVolumeSpecName: "config-data") pod "8eacc3da-f0eb-4c87-a981-f1cc3f180e37" (UID: "8eacc3da-f0eb-4c87-a981-f1cc3f180e37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.198878 4694 generic.go:334] "Generic (PLEG): container finished" podID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerID="bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd" exitCode=0 Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.198924 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerDied","Data":"bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd"} Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.198950 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eacc3da-f0eb-4c87-a981-f1cc3f180e37","Type":"ContainerDied","Data":"b57b4af1eb7b3d88bdbedf8a70780109c2f25a630a3996819fa085d4b987f36c"} Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.198965 4694 scope.go:117] "RemoveContainer" containerID="649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.199097 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.219545 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eacc3da-f0eb-4c87-a981-f1cc3f180e37-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.233837 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.254948 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.266833 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:17 crc kubenswrapper[4694]: E0217 17:04:17.267297 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-notification-agent" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267320 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-notification-agent" Feb 17 17:04:17 crc kubenswrapper[4694]: E0217 17:04:17.267356 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="proxy-httpd" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267366 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="proxy-httpd" Feb 17 17:04:17 crc kubenswrapper[4694]: E0217 17:04:17.267380 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="sg-core" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267388 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="sg-core" Feb 17 17:04:17 crc kubenswrapper[4694]: E0217 
17:04:17.267419 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-central-agent" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267427 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-central-agent" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267683 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-notification-agent" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267708 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="proxy-httpd" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267722 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="ceilometer-central-agent" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.267743 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" containerName="sg-core" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.269571 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.272424 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.273073 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.277714 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321132 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321334 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321380 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-run-httpd\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321411 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mscxv\" (UniqueName: \"kubernetes.io/projected/3b82b717-82c4-42c1-897a-d953cd9f6e2e-kube-api-access-mscxv\") pod \"ceilometer-0\" (UID: 
\"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321444 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-scripts\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321671 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-config-data\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.321802 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-log-httpd\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.423934 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424217 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-run-httpd\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424249 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mscxv\" (UniqueName: \"kubernetes.io/projected/3b82b717-82c4-42c1-897a-d953cd9f6e2e-kube-api-access-mscxv\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424269 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-scripts\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424322 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-config-data\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424370 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-log-httpd\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424393 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.424802 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-run-httpd\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 
crc kubenswrapper[4694]: I0217 17:04:17.425093 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-log-httpd\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.429800 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-scripts\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.429799 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.430156 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.431458 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-config-data\") pod \"ceilometer-0\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.442435 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mscxv\" (UniqueName: \"kubernetes.io/projected/3b82b717-82c4-42c1-897a-d953cd9f6e2e-kube-api-access-mscxv\") pod \"ceilometer-0\" (UID: 
\"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.595931 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:17 crc kubenswrapper[4694]: I0217 17:04:17.639687 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 17:04:18 crc kubenswrapper[4694]: I0217 17:04:18.914786 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eacc3da-f0eb-4c87-a981-f1cc3f180e37" path="/var/lib/kubelet/pods/8eacc3da-f0eb-4c87-a981-f1cc3f180e37/volumes" Feb 17 17:04:19 crc kubenswrapper[4694]: I0217 17:04:19.422857 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757dbcd46d-pw2kl" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.001482 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.009366 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9f54df747-vdnkk" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.015315 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.297825 4694 scope.go:117] "RemoveContainer" containerID="8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.354999 4694 scope.go:117] "RemoveContainer" containerID="bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.478244 4694 
scope.go:117] "RemoveContainer" containerID="cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.525299 4694 scope.go:117] "RemoveContainer" containerID="649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0" Feb 17 17:04:23 crc kubenswrapper[4694]: E0217 17:04:23.525879 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0\": container with ID starting with 649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0 not found: ID does not exist" containerID="649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.525929 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0"} err="failed to get container status \"649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0\": rpc error: code = NotFound desc = could not find container \"649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0\": container with ID starting with 649cded5b5011d05dcfca8e73533c38cddcc10a36a6252bbd69f3a050eaa8ff0 not found: ID does not exist" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.525981 4694 scope.go:117] "RemoveContainer" containerID="8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480" Feb 17 17:04:23 crc kubenswrapper[4694]: E0217 17:04:23.526273 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480\": container with ID starting with 8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480 not found: ID does not exist" containerID="8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480" Feb 17 
17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.526294 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480"} err="failed to get container status \"8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480\": rpc error: code = NotFound desc = could not find container \"8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480\": container with ID starting with 8d9b3583b3169dba67a28cab63761ce972bae06abc4b31df3305a98cd7b8c480 not found: ID does not exist" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.526312 4694 scope.go:117] "RemoveContainer" containerID="bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd" Feb 17 17:04:23 crc kubenswrapper[4694]: E0217 17:04:23.526517 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd\": container with ID starting with bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd not found: ID does not exist" containerID="bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.526541 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd"} err="failed to get container status \"bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd\": rpc error: code = NotFound desc = could not find container \"bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd\": container with ID starting with bdd05593349af580d4ccb34580cff43b88ef79b8bf6d31fbe0b5bf48a51880bd not found: ID does not exist" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.526557 4694 scope.go:117] "RemoveContainer" 
containerID="cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649" Feb 17 17:04:23 crc kubenswrapper[4694]: E0217 17:04:23.526774 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649\": container with ID starting with cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649 not found: ID does not exist" containerID="cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.526794 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649"} err="failed to get container status \"cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649\": rpc error: code = NotFound desc = could not find container \"cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649\": container with ID starting with cf97b7ba67936399cc0777f9a5ed5ca05a8150c0d31d12844a3a1991819d9649 not found: ID does not exist" Feb 17 17:04:23 crc kubenswrapper[4694]: I0217 17:04:23.861465 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:23 crc kubenswrapper[4694]: W0217 17:04:23.877842 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b82b717_82c4_42c1_897a_d953cd9f6e2e.slice/crio-8f3b641bf769448cba1eca39639c18e14c165e7b59c42c6d269230c231e505ad WatchSource:0}: Error finding container 8f3b641bf769448cba1eca39639c18e14c165e7b59c42c6d269230c231e505ad: Status 404 returned error can't find the container with id 8f3b641bf769448cba1eca39639c18e14c165e7b59c42c6d269230c231e505ad Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.258509 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerStarted","Data":"8f3b641bf769448cba1eca39639c18e14c165e7b59c42c6d269230c231e505ad"} Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.260018 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"30281bdf-b35e-4ac1-8cde-8a333e24f564","Type":"ContainerStarted","Data":"59f4e8f75a75dd164e958327eeb4437e2c62cf81270762102b4e4e48ff1d4d33"} Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.284945 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.486712948 podStartE2EDuration="13.284924005s" podCreationTimestamp="2026-02-17 17:04:11 +0000 UTC" firstStartedPulling="2026-02-17 17:04:12.565270446 +0000 UTC m=+1320.322345770" lastFinishedPulling="2026-02-17 17:04:23.363481513 +0000 UTC m=+1331.120556827" observedRunningTime="2026-02-17 17:04:24.282542766 +0000 UTC m=+1332.039618100" watchObservedRunningTime="2026-02-17 17:04:24.284924005 +0000 UTC m=+1332.041999329" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.725851 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.778744 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-secret-key\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.779135 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-tls-certs\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.779166 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc102be-9643-4310-900a-c6f6803a395a-logs\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.779197 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-config-data\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.779229 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d49lb\" (UniqueName: \"kubernetes.io/projected/5bc102be-9643-4310-900a-c6f6803a395a-kube-api-access-d49lb\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.779334 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-combined-ca-bundle\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.779386 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-scripts\") pod \"5bc102be-9643-4310-900a-c6f6803a395a\" (UID: \"5bc102be-9643-4310-900a-c6f6803a395a\") " Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.785477 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc102be-9643-4310-900a-c6f6803a395a-kube-api-access-d49lb" (OuterVolumeSpecName: "kube-api-access-d49lb") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "kube-api-access-d49lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.785491 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.793282 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc102be-9643-4310-900a-c6f6803a395a-logs" (OuterVolumeSpecName: "logs") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.804758 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-config-data" (OuterVolumeSpecName: "config-data") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.808889 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-scripts" (OuterVolumeSpecName: "scripts") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.812218 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.859173 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5bc102be-9643-4310-900a-c6f6803a395a" (UID: "5bc102be-9643-4310-900a-c6f6803a395a"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.862284 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-699759854f-bj949" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881434 4694 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881467 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bc102be-9643-4310-900a-c6f6803a395a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881477 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881488 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d49lb\" (UniqueName: \"kubernetes.io/projected/5bc102be-9643-4310-900a-c6f6803a395a-kube-api-access-d49lb\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881497 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881508 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bc102be-9643-4310-900a-c6f6803a395a-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.881516 4694 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/5bc102be-9643-4310-900a-c6f6803a395a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.926048 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c4845f94d-rb96f"] Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.926265 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c4845f94d-rb96f" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-api" containerID="cri-o://e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331" gracePeriod=30 Feb 17 17:04:24 crc kubenswrapper[4694]: I0217 17:04:24.926642 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c4845f94d-rb96f" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-httpd" containerID="cri-o://1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988" gracePeriod=30 Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.273788 4694 generic.go:334] "Generic (PLEG): container finished" podID="5bc102be-9643-4310-900a-c6f6803a395a" containerID="b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57" exitCode=137 Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.273862 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757dbcd46d-pw2kl" event={"ID":"5bc102be-9643-4310-900a-c6f6803a395a","Type":"ContainerDied","Data":"b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57"} Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.273895 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757dbcd46d-pw2kl" event={"ID":"5bc102be-9643-4310-900a-c6f6803a395a","Type":"ContainerDied","Data":"9716fa20f0d3ceeb6f2f08251d8443dd46d5ce0d9e00178fca9ee3d863432d1d"} Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.273914 4694 scope.go:117] "RemoveContainer" 
containerID="3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462" Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.274068 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-757dbcd46d-pw2kl" Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.281062 4694 generic.go:334] "Generic (PLEG): container finished" podID="abc565ee-1969-40b4-874f-1b71f43a8972" containerID="1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988" exitCode=0 Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.281276 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c4845f94d-rb96f" event={"ID":"abc565ee-1969-40b4-874f-1b71f43a8972","Type":"ContainerDied","Data":"1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988"} Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.292364 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerStarted","Data":"2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77"} Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.292672 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerStarted","Data":"627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065"} Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.300041 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-757dbcd46d-pw2kl"] Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.309186 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-757dbcd46d-pw2kl"] Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.443763 4694 scope.go:117] "RemoveContainer" containerID="b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57" Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.468526 4694 
scope.go:117] "RemoveContainer" containerID="3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462" Feb 17 17:04:25 crc kubenswrapper[4694]: E0217 17:04:25.472203 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462\": container with ID starting with 3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462 not found: ID does not exist" containerID="3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462" Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.472244 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462"} err="failed to get container status \"3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462\": rpc error: code = NotFound desc = could not find container \"3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462\": container with ID starting with 3268c6bb9e51776dd6e2b54040ed49dbf6f3b786e12beab7edccb82c350bc462 not found: ID does not exist" Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.472268 4694 scope.go:117] "RemoveContainer" containerID="b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57" Feb 17 17:04:25 crc kubenswrapper[4694]: E0217 17:04:25.472809 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57\": container with ID starting with b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57 not found: ID does not exist" containerID="b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57" Feb 17 17:04:25 crc kubenswrapper[4694]: I0217 17:04:25.472880 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57"} err="failed to get container status \"b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57\": rpc error: code = NotFound desc = could not find container \"b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57\": container with ID starting with b34a137bb624cd468d51c24a63f013807d15359c6fc8d2a99f3ab630900cad57 not found: ID does not exist" Feb 17 17:04:26 crc kubenswrapper[4694]: I0217 17:04:26.302150 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerStarted","Data":"c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1"} Feb 17 17:04:26 crc kubenswrapper[4694]: I0217 17:04:26.914102 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc102be-9643-4310-900a-c6f6803a395a" path="/var/lib/kubelet/pods/5bc102be-9643-4310-900a-c6f6803a395a/volumes" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.298664 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.320347 4694 generic.go:334] "Generic (PLEG): container finished" podID="abc565ee-1969-40b4-874f-1b71f43a8972" containerID="e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331" exitCode=0 Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.320406 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c4845f94d-rb96f" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.320429 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c4845f94d-rb96f" event={"ID":"abc565ee-1969-40b4-874f-1b71f43a8972","Type":"ContainerDied","Data":"e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331"} Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.320500 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c4845f94d-rb96f" event={"ID":"abc565ee-1969-40b4-874f-1b71f43a8972","Type":"ContainerDied","Data":"8ae8d6fde4f8c3f9d17f4de8dca5efeaebceddca60f90136d69fd45ce11b4802"} Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.320522 4694 scope.go:117] "RemoveContainer" containerID="1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.323189 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerStarted","Data":"c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14"} Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.323329 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-central-agent" containerID="cri-o://627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065" gracePeriod=30 Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.323360 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.323370 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-notification-agent" 
containerID="cri-o://2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77" gracePeriod=30 Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.323405 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="sg-core" containerID="cri-o://c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1" gracePeriod=30 Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.323369 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="proxy-httpd" containerID="cri-o://c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14" gracePeriod=30 Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.344564 4694 scope.go:117] "RemoveContainer" containerID="e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.362162 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-ovndb-tls-certs\") pod \"abc565ee-1969-40b4-874f-1b71f43a8972\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.362286 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-httpd-config\") pod \"abc565ee-1969-40b4-874f-1b71f43a8972\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.362356 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh69k\" (UniqueName: \"kubernetes.io/projected/abc565ee-1969-40b4-874f-1b71f43a8972-kube-api-access-qh69k\") pod \"abc565ee-1969-40b4-874f-1b71f43a8972\" (UID: 
\"abc565ee-1969-40b4-874f-1b71f43a8972\") " Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.362399 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-combined-ca-bundle\") pod \"abc565ee-1969-40b4-874f-1b71f43a8972\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.362510 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-config\") pod \"abc565ee-1969-40b4-874f-1b71f43a8972\" (UID: \"abc565ee-1969-40b4-874f-1b71f43a8972\") " Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.369756 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vcrmb"] Feb 17 17:04:28 crc kubenswrapper[4694]: E0217 17:04:28.370442 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-api" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370454 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-api" Feb 17 17:04:28 crc kubenswrapper[4694]: E0217 17:04:28.370467 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370473 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" Feb 17 17:04:28 crc kubenswrapper[4694]: E0217 17:04:28.370486 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-httpd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370492 4694 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-httpd" Feb 17 17:04:28 crc kubenswrapper[4694]: E0217 17:04:28.370503 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon-log" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370508 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon-log" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370721 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370740 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc102be-9643-4310-900a-c6f6803a395a" containerName="horizon-log" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370760 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-httpd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.370780 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" containerName="neutron-api" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.371373 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.372237 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc565ee-1969-40b4-874f-1b71f43a8972-kube-api-access-qh69k" (OuterVolumeSpecName: "kube-api-access-qh69k") pod "abc565ee-1969-40b4-874f-1b71f43a8972" (UID: "abc565ee-1969-40b4-874f-1b71f43a8972"). InnerVolumeSpecName "kube-api-access-qh69k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.378857 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vcrmb"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.399210 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "abc565ee-1969-40b4-874f-1b71f43a8972" (UID: "abc565ee-1969-40b4-874f-1b71f43a8972"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.414395 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.208567399 podStartE2EDuration="11.414376248s" podCreationTimestamp="2026-02-17 17:04:17 +0000 UTC" firstStartedPulling="2026-02-17 17:04:23.884445268 +0000 UTC m=+1331.641520582" lastFinishedPulling="2026-02-17 17:04:27.090254107 +0000 UTC m=+1334.847329431" observedRunningTime="2026-02-17 17:04:28.389055374 +0000 UTC m=+1336.146130698" watchObservedRunningTime="2026-02-17 17:04:28.414376248 +0000 UTC m=+1336.171451572" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.449071 4694 scope.go:117] "RemoveContainer" containerID="1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988" Feb 17 17:04:28 crc kubenswrapper[4694]: E0217 17:04:28.451283 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988\": container with ID starting with 1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988 not found: ID does not exist" containerID="1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.451323 4694 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988"} err="failed to get container status \"1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988\": rpc error: code = NotFound desc = could not find container \"1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988\": container with ID starting with 1fdbc459ad1d6b646785f24659daeda0d674b6254ee7c40d3f74c7996b734988 not found: ID does not exist" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.451350 4694 scope.go:117] "RemoveContainer" containerID="e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331" Feb 17 17:04:28 crc kubenswrapper[4694]: E0217 17:04:28.456203 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331\": container with ID starting with e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331 not found: ID does not exist" containerID="e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.456272 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331"} err="failed to get container status \"e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331\": rpc error: code = NotFound desc = could not find container \"e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331\": container with ID starting with e8deb11485ea22dba146e1cfc49fbe77fc727b5c1fe190ae1fc772c29fc2a331 not found: ID does not exist" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.464520 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgtv\" (UniqueName: 
\"kubernetes.io/projected/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-kube-api-access-dfgtv\") pod \"nova-api-db-create-vcrmb\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.464592 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-operator-scripts\") pod \"nova-api-db-create-vcrmb\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.464715 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.464731 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh69k\" (UniqueName: \"kubernetes.io/projected/abc565ee-1969-40b4-874f-1b71f43a8972-kube-api-access-qh69k\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.472712 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-config" (OuterVolumeSpecName: "config") pod "abc565ee-1969-40b4-874f-1b71f43a8972" (UID: "abc565ee-1969-40b4-874f-1b71f43a8972"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.502546 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abc565ee-1969-40b4-874f-1b71f43a8972" (UID: "abc565ee-1969-40b4-874f-1b71f43a8972"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.505870 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fq6xd"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.507417 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.543004 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fq6xd"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.548701 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "abc565ee-1969-40b4-874f-1b71f43a8972" (UID: "abc565ee-1969-40b4-874f-1b71f43a8972"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.563073 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3572-account-create-update-zzbtt"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.564663 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.566000 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfgtv\" (UniqueName: \"kubernetes.io/projected/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-kube-api-access-dfgtv\") pod \"nova-api-db-create-vcrmb\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.566061 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-operator-scripts\") pod \"nova-api-db-create-vcrmb\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.566126 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwzf\" (UniqueName: \"kubernetes.io/projected/1c514806-c7d8-4b8f-ba4f-866d382f6d82-kube-api-access-bjwzf\") pod \"nova-cell0-db-create-fq6xd\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.566167 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c514806-c7d8-4b8f-ba4f-866d382f6d82-operator-scripts\") pod \"nova-cell0-db-create-fq6xd\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.566230 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 
17:04:28.566246 4694 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.566256 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc565ee-1969-40b4-874f-1b71f43a8972-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.567333 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-operator-scripts\") pod \"nova-api-db-create-vcrmb\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.567449 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.584712 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgtv\" (UniqueName: \"kubernetes.io/projected/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-kube-api-access-dfgtv\") pod \"nova-api-db-create-vcrmb\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.587022 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3572-account-create-update-zzbtt"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.642741 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zgbvz"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.644079 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.652267 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zgbvz"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.669142 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwzf\" (UniqueName: \"kubernetes.io/projected/1c514806-c7d8-4b8f-ba4f-866d382f6d82-kube-api-access-bjwzf\") pod \"nova-cell0-db-create-fq6xd\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.669190 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99d812e0-ed6a-4704-a149-35d388dc6d9b-operator-scripts\") pod \"nova-api-3572-account-create-update-zzbtt\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.669237 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c514806-c7d8-4b8f-ba4f-866d382f6d82-operator-scripts\") pod \"nova-cell0-db-create-fq6xd\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.669266 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65x27\" (UniqueName: \"kubernetes.io/projected/99d812e0-ed6a-4704-a149-35d388dc6d9b-kube-api-access-65x27\") pod \"nova-api-3572-account-create-update-zzbtt\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.670157 4694 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c514806-c7d8-4b8f-ba4f-866d382f6d82-operator-scripts\") pod \"nova-cell0-db-create-fq6xd\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.671122 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c4845f94d-rb96f"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.678408 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c4845f94d-rb96f"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.685626 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwzf\" (UniqueName: \"kubernetes.io/projected/1c514806-c7d8-4b8f-ba4f-866d382f6d82-kube-api-access-bjwzf\") pod \"nova-cell0-db-create-fq6xd\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.759290 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a9d3-account-create-update-qvwcg"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.760572 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.764792 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.768265 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a9d3-account-create-update-qvwcg"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.771259 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99d812e0-ed6a-4704-a149-35d388dc6d9b-operator-scripts\") pod \"nova-api-3572-account-create-update-zzbtt\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.771316 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxpm8\" (UniqueName: \"kubernetes.io/projected/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-kube-api-access-pxpm8\") pod \"nova-cell1-db-create-zgbvz\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.771348 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65x27\" (UniqueName: \"kubernetes.io/projected/99d812e0-ed6a-4704-a149-35d388dc6d9b-kube-api-access-65x27\") pod \"nova-api-3572-account-create-update-zzbtt\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.771402 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-operator-scripts\") pod 
\"nova-cell1-db-create-zgbvz\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.772154 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99d812e0-ed6a-4704-a149-35d388dc6d9b-operator-scripts\") pod \"nova-api-3572-account-create-update-zzbtt\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.791323 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65x27\" (UniqueName: \"kubernetes.io/projected/99d812e0-ed6a-4704-a149-35d388dc6d9b-kube-api-access-65x27\") pod \"nova-api-3572-account-create-update-zzbtt\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.872839 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnrh\" (UniqueName: \"kubernetes.io/projected/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-kube-api-access-cwnrh\") pod \"nova-cell0-a9d3-account-create-update-qvwcg\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.873276 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxpm8\" (UniqueName: \"kubernetes.io/projected/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-kube-api-access-pxpm8\") pod \"nova-cell1-db-create-zgbvz\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.873446 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-operator-scripts\") pod \"nova-cell0-a9d3-account-create-update-qvwcg\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.873712 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-operator-scripts\") pod \"nova-cell1-db-create-zgbvz\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.874382 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-operator-scripts\") pod \"nova-cell1-db-create-zgbvz\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.878305 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.888942 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxpm8\" (UniqueName: \"kubernetes.io/projected/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-kube-api-access-pxpm8\") pod \"nova-cell1-db-create-zgbvz\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.891909 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.900036 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.906209 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc565ee-1969-40b4-874f-1b71f43a8972" path="/var/lib/kubelet/pods/abc565ee-1969-40b4-874f-1b71f43a8972/volumes" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.954424 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-327d-account-create-update-5mzdp"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.955736 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.959575 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.961208 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.966193 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-327d-account-create-update-5mzdp"] Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.984714 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-operator-scripts\") pod \"nova-cell0-a9d3-account-create-update-qvwcg\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.984875 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnrh\" (UniqueName: \"kubernetes.io/projected/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-kube-api-access-cwnrh\") pod \"nova-cell0-a9d3-account-create-update-qvwcg\" (UID: 
\"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:28 crc kubenswrapper[4694]: I0217 17:04:28.986537 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-operator-scripts\") pod \"nova-cell0-a9d3-account-create-update-qvwcg\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.002163 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnrh\" (UniqueName: \"kubernetes.io/projected/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-kube-api-access-cwnrh\") pod \"nova-cell0-a9d3-account-create-update-qvwcg\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.082430 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.086535 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-operator-scripts\") pod \"nova-cell1-327d-account-create-update-5mzdp\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.086645 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqmzk\" (UniqueName: \"kubernetes.io/projected/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-kube-api-access-rqmzk\") pod \"nova-cell1-327d-account-create-update-5mzdp\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.188293 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-operator-scripts\") pod \"nova-cell1-327d-account-create-update-5mzdp\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.188372 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqmzk\" (UniqueName: \"kubernetes.io/projected/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-kube-api-access-rqmzk\") pod \"nova-cell1-327d-account-create-update-5mzdp\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.189668 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-operator-scripts\") pod \"nova-cell1-327d-account-create-update-5mzdp\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.209748 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqmzk\" (UniqueName: \"kubernetes.io/projected/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-kube-api-access-rqmzk\") pod \"nova-cell1-327d-account-create-update-5mzdp\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.285874 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.337271 4694 generic.go:334] "Generic (PLEG): container finished" podID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerID="c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14" exitCode=0 Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.337342 4694 generic.go:334] "Generic (PLEG): container finished" podID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerID="c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1" exitCode=2 Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.337353 4694 generic.go:334] "Generic (PLEG): container finished" podID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerID="2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77" exitCode=0 Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.337335 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerDied","Data":"c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14"} Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 
17:04:29.337404 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerDied","Data":"c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1"} Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.337419 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerDied","Data":"2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77"} Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.400695 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vcrmb"] Feb 17 17:04:29 crc kubenswrapper[4694]: W0217 17:04:29.505474 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99d812e0_ed6a_4704_a149_35d388dc6d9b.slice/crio-90871eccdee2ec8b0c859dad25e9de20066b99e836c949aa14aa3339eae56e05 WatchSource:0}: Error finding container 90871eccdee2ec8b0c859dad25e9de20066b99e836c949aa14aa3339eae56e05: Status 404 returned error can't find the container with id 90871eccdee2ec8b0c859dad25e9de20066b99e836c949aa14aa3339eae56e05 Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.545508 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fq6xd"] Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.570586 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3572-account-create-update-zzbtt"] Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.627986 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zgbvz"] Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.828200 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a9d3-account-create-update-qvwcg"] Feb 17 17:04:29 crc kubenswrapper[4694]: I0217 17:04:29.949277 4694 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-327d-account-create-update-5mzdp"] Feb 17 17:04:30 crc kubenswrapper[4694]: W0217 17:04:30.011580 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3af0afc1_594e_4c0d_a74b_6fb9266a7f57.slice/crio-73ba6c86387844a38bdfd7128ff34eb88e420be917e641348e541e413c1c1de4 WatchSource:0}: Error finding container 73ba6c86387844a38bdfd7128ff34eb88e420be917e641348e541e413c1c1de4: Status 404 returned error can't find the container with id 73ba6c86387844a38bdfd7128ff34eb88e420be917e641348e541e413c1c1de4 Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.347131 4694 generic.go:334] "Generic (PLEG): container finished" podID="f2af48f7-ad7a-4692-a40c-c6d30bbe2402" containerID="cfceb706fc3f55ab542e92a971751838abefc980efe7a7cd5d23cd248e20f6a2" exitCode=0 Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.347180 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vcrmb" event={"ID":"f2af48f7-ad7a-4692-a40c-c6d30bbe2402","Type":"ContainerDied","Data":"cfceb706fc3f55ab542e92a971751838abefc980efe7a7cd5d23cd248e20f6a2"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.347222 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vcrmb" event={"ID":"f2af48f7-ad7a-4692-a40c-c6d30bbe2402","Type":"ContainerStarted","Data":"7783c0c04b984566f3f7478e72513faab55ad692c43a4a3b0bf3561584aa46b5"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.349431 4694 generic.go:334] "Generic (PLEG): container finished" podID="9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" containerID="a1e011f48b4ba8031e1e705766f1981fcec519350256b5cc05faac3f3ce1cad9" exitCode=0 Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.349479 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zgbvz" 
event={"ID":"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0","Type":"ContainerDied","Data":"a1e011f48b4ba8031e1e705766f1981fcec519350256b5cc05faac3f3ce1cad9"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.349528 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zgbvz" event={"ID":"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0","Type":"ContainerStarted","Data":"933f994e27a3af53c805da96cdf4a07ef3cb4a4d72055aedb966bcc5ff6cc1ff"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.351120 4694 generic.go:334] "Generic (PLEG): container finished" podID="99d812e0-ed6a-4704-a149-35d388dc6d9b" containerID="88b29609efa228f8a413bf695867660c805fb47539ef92289ab3c9f246872be9" exitCode=0 Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.351170 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3572-account-create-update-zzbtt" event={"ID":"99d812e0-ed6a-4704-a149-35d388dc6d9b","Type":"ContainerDied","Data":"88b29609efa228f8a413bf695867660c805fb47539ef92289ab3c9f246872be9"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.351249 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3572-account-create-update-zzbtt" event={"ID":"99d812e0-ed6a-4704-a149-35d388dc6d9b","Type":"ContainerStarted","Data":"90871eccdee2ec8b0c859dad25e9de20066b99e836c949aa14aa3339eae56e05"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.352937 4694 generic.go:334] "Generic (PLEG): container finished" podID="1c514806-c7d8-4b8f-ba4f-866d382f6d82" containerID="10cc4e59299d63d7e4085c9de930c7d20d7b910cde4706468651a75a3bfe68e8" exitCode=0 Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.352994 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fq6xd" event={"ID":"1c514806-c7d8-4b8f-ba4f-866d382f6d82","Type":"ContainerDied","Data":"10cc4e59299d63d7e4085c9de930c7d20d7b910cde4706468651a75a3bfe68e8"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.353025 
4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fq6xd" event={"ID":"1c514806-c7d8-4b8f-ba4f-866d382f6d82","Type":"ContainerStarted","Data":"3affb678837cd81ec2a5400fc1c0e9452e045bf7ade21f8bd5bc5065e62bd55c"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.354595 4694 generic.go:334] "Generic (PLEG): container finished" podID="a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" containerID="f948a22f8d37eb66368806270f57d55d41a8d2a68c66b08027eff86ec3789c89" exitCode=0 Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.354662 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" event={"ID":"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9","Type":"ContainerDied","Data":"f948a22f8d37eb66368806270f57d55d41a8d2a68c66b08027eff86ec3789c89"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.354687 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" event={"ID":"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9","Type":"ContainerStarted","Data":"1f6b3f08fddc45ba36c37296961c5a2107a023f93399abdac1ba6a58fd86f348"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.356499 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" event={"ID":"3af0afc1-594e-4c0d-a74b-6fb9266a7f57","Type":"ContainerStarted","Data":"1277097a7fada0aeb58b66553e375879a146eaa803facec0851bdf6bbecd8211"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.356535 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" event={"ID":"3af0afc1-594e-4c0d-a74b-6fb9266a7f57","Type":"ContainerStarted","Data":"73ba6c86387844a38bdfd7128ff34eb88e420be917e641348e541e413c1c1de4"} Feb 17 17:04:30 crc kubenswrapper[4694]: I0217 17:04:30.444987 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-327d-account-create-update-5mzdp" podStartSLOduration=2.444968084 podStartE2EDuration="2.444968084s" podCreationTimestamp="2026-02-17 17:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:30.442919764 +0000 UTC m=+1338.199995108" watchObservedRunningTime="2026-02-17 17:04:30.444968084 +0000 UTC m=+1338.202043418" Feb 17 17:04:31 crc kubenswrapper[4694]: I0217 17:04:31.364494 4694 generic.go:334] "Generic (PLEG): container finished" podID="3af0afc1-594e-4c0d-a74b-6fb9266a7f57" containerID="1277097a7fada0aeb58b66553e375879a146eaa803facec0851bdf6bbecd8211" exitCode=0 Feb 17 17:04:31 crc kubenswrapper[4694]: I0217 17:04:31.365193 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" event={"ID":"3af0afc1-594e-4c0d-a74b-6fb9266a7f57","Type":"ContainerDied","Data":"1277097a7fada0aeb58b66553e375879a146eaa803facec0851bdf6bbecd8211"} Feb 17 17:04:31 crc kubenswrapper[4694]: I0217 17:04:31.942371 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.055215 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99d812e0-ed6a-4704-a149-35d388dc6d9b-operator-scripts\") pod \"99d812e0-ed6a-4704-a149-35d388dc6d9b\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.055426 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65x27\" (UniqueName: \"kubernetes.io/projected/99d812e0-ed6a-4704-a149-35d388dc6d9b-kube-api-access-65x27\") pod \"99d812e0-ed6a-4704-a149-35d388dc6d9b\" (UID: \"99d812e0-ed6a-4704-a149-35d388dc6d9b\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.058516 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d812e0-ed6a-4704-a149-35d388dc6d9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99d812e0-ed6a-4704-a149-35d388dc6d9b" (UID: "99d812e0-ed6a-4704-a149-35d388dc6d9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.066160 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d812e0-ed6a-4704-a149-35d388dc6d9b-kube-api-access-65x27" (OuterVolumeSpecName: "kube-api-access-65x27") pod "99d812e0-ed6a-4704-a149-35d388dc6d9b" (UID: "99d812e0-ed6a-4704-a149-35d388dc6d9b"). InnerVolumeSpecName "kube-api-access-65x27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.157019 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65x27\" (UniqueName: \"kubernetes.io/projected/99d812e0-ed6a-4704-a149-35d388dc6d9b-kube-api-access-65x27\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.157089 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99d812e0-ed6a-4704-a149-35d388dc6d9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.178906 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.250487 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.259492 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjwzf\" (UniqueName: \"kubernetes.io/projected/1c514806-c7d8-4b8f-ba4f-866d382f6d82-kube-api-access-bjwzf\") pod \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.259580 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c514806-c7d8-4b8f-ba4f-866d382f6d82-operator-scripts\") pod \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\" (UID: \"1c514806-c7d8-4b8f-ba4f-866d382f6d82\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.259986 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.260803 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c514806-c7d8-4b8f-ba4f-866d382f6d82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c514806-c7d8-4b8f-ba4f-866d382f6d82" (UID: "1c514806-c7d8-4b8f-ba4f-866d382f6d82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.267450 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c514806-c7d8-4b8f-ba4f-866d382f6d82-kube-api-access-bjwzf" (OuterVolumeSpecName: "kube-api-access-bjwzf") pod "1c514806-c7d8-4b8f-ba4f-866d382f6d82" (UID: "1c514806-c7d8-4b8f-ba4f-866d382f6d82"). InnerVolumeSpecName "kube-api-access-bjwzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.307256 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.378797 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-operator-scripts\") pod \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.379174 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxpm8\" (UniqueName: \"kubernetes.io/projected/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-kube-api-access-pxpm8\") pod \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.379255 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwnrh\" (UniqueName: \"kubernetes.io/projected/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-kube-api-access-cwnrh\") pod \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\" (UID: \"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.379272 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-operator-scripts\") pod \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\" (UID: \"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.379297 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfgtv\" (UniqueName: \"kubernetes.io/projected/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-kube-api-access-dfgtv\") pod \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.379373 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-operator-scripts\") pod \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\" (UID: \"f2af48f7-ad7a-4692-a40c-c6d30bbe2402\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.380302 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjwzf\" (UniqueName: \"kubernetes.io/projected/1c514806-c7d8-4b8f-ba4f-866d382f6d82-kube-api-access-bjwzf\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.380318 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c514806-c7d8-4b8f-ba4f-866d382f6d82-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.381112 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2af48f7-ad7a-4692-a40c-c6d30bbe2402" (UID: "f2af48f7-ad7a-4692-a40c-c6d30bbe2402"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.381306 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" (UID: "a9bd5acb-d4c7-474d-b8cb-3e39de8265b9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.381802 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" (UID: "9c35aab0-3f76-440f-b2b6-eecbe7ddbff0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.387295 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-kube-api-access-cwnrh" (OuterVolumeSpecName: "kube-api-access-cwnrh") pod "a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" (UID: "a9bd5acb-d4c7-474d-b8cb-3e39de8265b9"). InnerVolumeSpecName "kube-api-access-cwnrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.388795 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-kube-api-access-pxpm8" (OuterVolumeSpecName: "kube-api-access-pxpm8") pod "9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" (UID: "9c35aab0-3f76-440f-b2b6-eecbe7ddbff0"). InnerVolumeSpecName "kube-api-access-pxpm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.394259 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.396438 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-kube-api-access-dfgtv" (OuterVolumeSpecName: "kube-api-access-dfgtv") pod "f2af48f7-ad7a-4692-a40c-c6d30bbe2402" (UID: "f2af48f7-ad7a-4692-a40c-c6d30bbe2402"). 
InnerVolumeSpecName "kube-api-access-dfgtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.420329 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" event={"ID":"a9bd5acb-d4c7-474d-b8cb-3e39de8265b9","Type":"ContainerDied","Data":"1f6b3f08fddc45ba36c37296961c5a2107a023f93399abdac1ba6a58fd86f348"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.420400 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6b3f08fddc45ba36c37296961c5a2107a023f93399abdac1ba6a58fd86f348" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.420484 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a9d3-account-create-update-qvwcg" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.424753 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vcrmb" event={"ID":"f2af48f7-ad7a-4692-a40c-c6d30bbe2402","Type":"ContainerDied","Data":"7783c0c04b984566f3f7478e72513faab55ad692c43a4a3b0bf3561584aa46b5"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.424793 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7783c0c04b984566f3f7478e72513faab55ad692c43a4a3b0bf3561584aa46b5" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.424840 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vcrmb" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.439946 4694 generic.go:334] "Generic (PLEG): container finished" podID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerID="627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065" exitCode=0 Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.440042 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerDied","Data":"627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.440069 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b82b717-82c4-42c1-897a-d953cd9f6e2e","Type":"ContainerDied","Data":"8f3b641bf769448cba1eca39639c18e14c165e7b59c42c6d269230c231e505ad"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.440086 4694 scope.go:117] "RemoveContainer" containerID="c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.440202 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.444323 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zgbvz" event={"ID":"9c35aab0-3f76-440f-b2b6-eecbe7ddbff0","Type":"ContainerDied","Data":"933f994e27a3af53c805da96cdf4a07ef3cb4a4d72055aedb966bcc5ff6cc1ff"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.444386 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="933f994e27a3af53c805da96cdf4a07ef3cb4a4d72055aedb966bcc5ff6cc1ff" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.444481 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zgbvz" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.453454 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3572-account-create-update-zzbtt" event={"ID":"99d812e0-ed6a-4704-a149-35d388dc6d9b","Type":"ContainerDied","Data":"90871eccdee2ec8b0c859dad25e9de20066b99e836c949aa14aa3339eae56e05"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.453516 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90871eccdee2ec8b0c859dad25e9de20066b99e836c949aa14aa3339eae56e05" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.453588 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3572-account-create-update-zzbtt" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.462861 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fq6xd" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.465757 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fq6xd" event={"ID":"1c514806-c7d8-4b8f-ba4f-866d382f6d82","Type":"ContainerDied","Data":"3affb678837cd81ec2a5400fc1c0e9452e045bf7ade21f8bd5bc5065e62bd55c"} Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.465804 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3affb678837cd81ec2a5400fc1c0e9452e045bf7ade21f8bd5bc5065e62bd55c" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.482720 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-sg-core-conf-yaml\") pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.482841 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-config-data\") pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.482906 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-log-httpd\") pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.483459 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.483491 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mscxv\" (UniqueName: \"kubernetes.io/projected/3b82b717-82c4-42c1-897a-d953cd9f6e2e-kube-api-access-mscxv\") pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.484129 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-combined-ca-bundle\") pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.484197 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-run-httpd\") 
pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.484224 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-scripts\") pod \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\" (UID: \"3b82b717-82c4-42c1-897a-d953cd9f6e2e\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485243 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485289 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxpm8\" (UniqueName: \"kubernetes.io/projected/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-kube-api-access-pxpm8\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485306 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485317 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwnrh\" (UniqueName: \"kubernetes.io/projected/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9-kube-api-access-cwnrh\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485330 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfgtv\" (UniqueName: \"kubernetes.io/projected/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-kube-api-access-dfgtv\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485342 4694 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.485377 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2af48f7-ad7a-4692-a40c-c6d30bbe2402-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.486966 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b82b717-82c4-42c1-897a-d953cd9f6e2e-kube-api-access-mscxv" (OuterVolumeSpecName: "kube-api-access-mscxv") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "kube-api-access-mscxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.487629 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.489334 4694 scope.go:117] "RemoveContainer" containerID="c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.489346 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-scripts" (OuterVolumeSpecName: "scripts") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.520417 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.568923 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.586480 4694 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b82b717-82c4-42c1-897a-d953cd9f6e2e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.586500 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.586509 4694 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.586519 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mscxv\" (UniqueName: \"kubernetes.io/projected/3b82b717-82c4-42c1-897a-d953cd9f6e2e-kube-api-access-mscxv\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.603323 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.614594 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-config-data" (OuterVolumeSpecName: "config-data") pod "3b82b717-82c4-42c1-897a-d953cd9f6e2e" (UID: "3b82b717-82c4-42c1-897a-d953cd9f6e2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.637819 4694 scope.go:117] "RemoveContainer" containerID="2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.674968 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7569cc6b-bv5js" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.684026 4694 scope.go:117] "RemoveContainer" containerID="627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.688115 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.688138 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b82b717-82c4-42c1-897a-d953cd9f6e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.694804 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2af48f7_ad7a_4692_a40c_c6d30bbe2402.slice\": 
RecentStats: unable to find data in memory cache]" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.723129 4694 scope.go:117] "RemoveContainer" containerID="c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.728775 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14\": container with ID starting with c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14 not found: ID does not exist" containerID="c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.728829 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14"} err="failed to get container status \"c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14\": rpc error: code = NotFound desc = could not find container \"c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14\": container with ID starting with c81e4eaf4c2bdd20b5effec11c452a318ed727a93700cd171127b315e4e5ff14 not found: ID does not exist" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.728864 4694 scope.go:117] "RemoveContainer" containerID="c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.729798 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1\": container with ID starting with c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1 not found: ID does not exist" containerID="c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.729828 4694 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1"} err="failed to get container status \"c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1\": rpc error: code = NotFound desc = could not find container \"c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1\": container with ID starting with c1734de3cc95f6e1f911137761f64214b68710f5357df995804e054e9de1c6c1 not found: ID does not exist" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.729847 4694 scope.go:117] "RemoveContainer" containerID="2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.733521 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77\": container with ID starting with 2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77 not found: ID does not exist" containerID="2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.733571 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77"} err="failed to get container status \"2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77\": rpc error: code = NotFound desc = could not find container \"2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77\": container with ID starting with 2288d75764ab32909834d0dfbe3a6f1b8be2df936fa2f188c050594e28135d77 not found: ID does not exist" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.733601 4694 scope.go:117] "RemoveContainer" containerID="627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 
17:04:32.736946 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065\": container with ID starting with 627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065 not found: ID does not exist" containerID="627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.737013 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065"} err="failed to get container status \"627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065\": rpc error: code = NotFound desc = could not find container \"627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065\": container with ID starting with 627e8d34f23b7b6269fcf4d3c187cb12c1d88709046bc0adf6d09c99e588e065 not found: ID does not exist" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.762404 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-59959cfcd4-nr2rp"] Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.762980 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-59959cfcd4-nr2rp" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-log" containerID="cri-o://4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f" gracePeriod=30 Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.763141 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-59959cfcd4-nr2rp" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-api" containerID="cri-o://d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8" gracePeriod=30 Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.801589 4694 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.843326 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.865782 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.893671 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894042 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d812e0-ed6a-4704-a149-35d388dc6d9b" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894059 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d812e0-ed6a-4704-a149-35d388dc6d9b" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894068 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894074 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894101 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-central-agent" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894110 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-central-agent" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894122 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c514806-c7d8-4b8f-ba4f-866d382f6d82" 
containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894131 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c514806-c7d8-4b8f-ba4f-866d382f6d82" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894158 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af0afc1-594e-4c0d-a74b-6fb9266a7f57" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894165 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af0afc1-594e-4c0d-a74b-6fb9266a7f57" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894175 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="proxy-httpd" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894183 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="proxy-httpd" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894195 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="sg-core" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894203 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="sg-core" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894217 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-notification-agent" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894224 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-notification-agent" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894233 4694 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f2af48f7-ad7a-4692-a40c-c6d30bbe2402" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894239 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2af48f7-ad7a-4692-a40c-c6d30bbe2402" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: E0217 17:04:32.894248 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894254 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894420 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="proxy-httpd" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894435 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894442 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d812e0-ed6a-4704-a149-35d388dc6d9b" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894450 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c514806-c7d8-4b8f-ba4f-866d382f6d82" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894462 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="sg-core" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894476 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2af48f7-ad7a-4692-a40c-c6d30bbe2402" containerName="mariadb-database-create" Feb 17 17:04:32 crc kubenswrapper[4694]: 
I0217 17:04:32.894484 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894495 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-central-agent" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894506 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af0afc1-594e-4c0d-a74b-6fb9266a7f57" containerName="mariadb-account-create-update" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.894512 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" containerName="ceilometer-notification-agent" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.896152 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.900219 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-operator-scripts\") pod \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.900402 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqmzk\" (UniqueName: \"kubernetes.io/projected/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-kube-api-access-rqmzk\") pod \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\" (UID: \"3af0afc1-594e-4c0d-a74b-6fb9266a7f57\") " Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.900862 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"3af0afc1-594e-4c0d-a74b-6fb9266a7f57" (UID: "3af0afc1-594e-4c0d-a74b-6fb9266a7f57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.901998 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.902261 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.914096 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-kube-api-access-rqmzk" (OuterVolumeSpecName: "kube-api-access-rqmzk") pod "3af0afc1-594e-4c0d-a74b-6fb9266a7f57" (UID: "3af0afc1-594e-4c0d-a74b-6fb9266a7f57"). InnerVolumeSpecName "kube-api-access-rqmzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.915986 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b82b717-82c4-42c1-897a-d953cd9f6e2e" path="/var/lib/kubelet/pods/3b82b717-82c4-42c1-897a-d953cd9f6e2e/volumes" Feb 17 17:04:32 crc kubenswrapper[4694]: I0217 17:04:32.917135 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002155 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-log-httpd\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002211 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002252 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002274 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-run-httpd\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002336 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-config-data\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002388 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-scripts\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002417 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlllt\" (UniqueName: \"kubernetes.io/projected/d525bca6-b587-4bac-a01b-d9b410ad69f6-kube-api-access-wlllt\") pod \"ceilometer-0\" (UID: 
\"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002482 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqmzk\" (UniqueName: \"kubernetes.io/projected/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-kube-api-access-rqmzk\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.002493 4694 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af0afc1-594e-4c0d-a74b-6fb9266a7f57-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.103989 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.104040 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-run-httpd\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.104069 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-config-data\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.104109 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-scripts\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " 
pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.104134 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlllt\" (UniqueName: \"kubernetes.io/projected/d525bca6-b587-4bac-a01b-d9b410ad69f6-kube-api-access-wlllt\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.104207 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-log-httpd\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.104235 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.105715 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-log-httpd\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.105989 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-run-httpd\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.110057 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-scripts\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.110185 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.111847 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.119503 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-config-data\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.121940 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlllt\" (UniqueName: \"kubernetes.io/projected/d525bca6-b587-4bac-a01b-d9b410ad69f6-kube-api-access-wlllt\") pod \"ceilometer-0\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.290115 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.475728 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" event={"ID":"3af0afc1-594e-4c0d-a74b-6fb9266a7f57","Type":"ContainerDied","Data":"73ba6c86387844a38bdfd7128ff34eb88e420be917e641348e541e413c1c1de4"} Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.475768 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ba6c86387844a38bdfd7128ff34eb88e420be917e641348e541e413c1c1de4" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.475820 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-327d-account-create-update-5mzdp" Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.481147 4694 generic.go:334] "Generic (PLEG): container finished" podID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerID="4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f" exitCode=143 Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.481655 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59959cfcd4-nr2rp" event={"ID":"58c86e79-4506-4f20-83e3-1e7e85c07c80","Type":"ContainerDied","Data":"4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f"} Feb 17 17:04:33 crc kubenswrapper[4694]: I0217 17:04:33.752825 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.053796 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fpg7"] Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.055148 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.058047 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.058287 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dg5ch" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.058298 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.076067 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fpg7"] Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.125354 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-scripts\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.125405 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.125446 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjcl\" (UniqueName: \"kubernetes.io/projected/146d2c58-f359-4372-810a-7ab64e022ad1-kube-api-access-5jjcl\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " 
pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.125564 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-config-data\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.227461 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-scripts\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.227539 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.227593 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjcl\" (UniqueName: \"kubernetes.io/projected/146d2c58-f359-4372-810a-7ab64e022ad1-kube-api-access-5jjcl\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.227646 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-config-data\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: 
\"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.231668 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-scripts\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.232372 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-config-data\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.235096 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.247098 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjcl\" (UniqueName: \"kubernetes.io/projected/146d2c58-f359-4372-810a-7ab64e022ad1-kube-api-access-5jjcl\") pod \"nova-cell0-conductor-db-sync-8fpg7\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.381402 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.496012 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerStarted","Data":"ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15"} Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.496449 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerStarted","Data":"00eafaed39d3886045095a2c22bbc5831a2feba0bff6c59153c2a9b9894a43a4"} Feb 17 17:04:34 crc kubenswrapper[4694]: W0217 17:04:34.817357 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod146d2c58_f359_4372_810a_7ab64e022ad1.slice/crio-99aff341abac15b0cdb0984764f73a6256cb76438389561dccf2439d48579ab3 WatchSource:0}: Error finding container 99aff341abac15b0cdb0984764f73a6256cb76438389561dccf2439d48579ab3: Status 404 returned error can't find the container with id 99aff341abac15b0cdb0984764f73a6256cb76438389561dccf2439d48579ab3 Feb 17 17:04:34 crc kubenswrapper[4694]: I0217 17:04:34.831154 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fpg7"] Feb 17 17:04:35 crc kubenswrapper[4694]: I0217 17:04:35.506231 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerStarted","Data":"1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952"} Feb 17 17:04:35 crc kubenswrapper[4694]: I0217 17:04:35.507445 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" 
event={"ID":"146d2c58-f359-4372-810a-7ab64e022ad1","Type":"ContainerStarted","Data":"99aff341abac15b0cdb0984764f73a6256cb76438389561dccf2439d48579ab3"} Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.379671 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468514 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-config-data\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468660 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-combined-ca-bundle\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468735 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdmdr\" (UniqueName: \"kubernetes.io/projected/58c86e79-4506-4f20-83e3-1e7e85c07c80-kube-api-access-bdmdr\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468813 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c86e79-4506-4f20-83e3-1e7e85c07c80-logs\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468870 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-internal-tls-certs\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468890 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-scripts\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.468919 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-public-tls-certs\") pod \"58c86e79-4506-4f20-83e3-1e7e85c07c80\" (UID: \"58c86e79-4506-4f20-83e3-1e7e85c07c80\") " Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.470330 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c86e79-4506-4f20-83e3-1e7e85c07c80-logs" (OuterVolumeSpecName: "logs") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.474828 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c86e79-4506-4f20-83e3-1e7e85c07c80-kube-api-access-bdmdr" (OuterVolumeSpecName: "kube-api-access-bdmdr") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "kube-api-access-bdmdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.477720 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-scripts" (OuterVolumeSpecName: "scripts") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.522203 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerStarted","Data":"14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc"} Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.524737 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-config-data" (OuterVolumeSpecName: "config-data") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.525307 4694 generic.go:334] "Generic (PLEG): container finished" podID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerID="d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8" exitCode=0 Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.525338 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59959cfcd4-nr2rp" event={"ID":"58c86e79-4506-4f20-83e3-1e7e85c07c80","Type":"ContainerDied","Data":"d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8"} Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.525359 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59959cfcd4-nr2rp" event={"ID":"58c86e79-4506-4f20-83e3-1e7e85c07c80","Type":"ContainerDied","Data":"c95e5374e43ec984f9bc4f93240403d36834e18ce5da4bdb616dd26eeeefab16"} Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.525376 4694 scope.go:117] "RemoveContainer" containerID="d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.525496 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59959cfcd4-nr2rp" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.550905 4694 scope.go:117] "RemoveContainer" containerID="4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.560956 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.570926 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.570954 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.570964 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdmdr\" (UniqueName: \"kubernetes.io/projected/58c86e79-4506-4f20-83e3-1e7e85c07c80-kube-api-access-bdmdr\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.570974 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58c86e79-4506-4f20-83e3-1e7e85c07c80-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.570982 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.577101 4694 scope.go:117] "RemoveContainer" containerID="d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8" Feb 17 17:04:36 crc kubenswrapper[4694]: E0217 17:04:36.577740 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8\": container with ID starting with d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8 not found: ID does not exist" 
containerID="d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.577792 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8"} err="failed to get container status \"d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8\": rpc error: code = NotFound desc = could not find container \"d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8\": container with ID starting with d84259e919991a12c82c5696002f94071cf20870033b6494a64e4589442b75c8 not found: ID does not exist" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.577826 4694 scope.go:117] "RemoveContainer" containerID="4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f" Feb 17 17:04:36 crc kubenswrapper[4694]: E0217 17:04:36.579322 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f\": container with ID starting with 4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f not found: ID does not exist" containerID="4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.579380 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f"} err="failed to get container status \"4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f\": rpc error: code = NotFound desc = could not find container \"4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f\": container with ID starting with 4dbde331aa69350cb9e8064c3414f0aee7f503cafddc37318f5e7b5b4229aa9f not found: ID does not exist" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.586845 4694 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.592698 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "58c86e79-4506-4f20-83e3-1e7e85c07c80" (UID: "58c86e79-4506-4f20-83e3-1e7e85c07c80"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.672583 4694 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.672655 4694 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58c86e79-4506-4f20-83e3-1e7e85c07c80-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.893986 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-59959cfcd4-nr2rp"] Feb 17 17:04:36 crc kubenswrapper[4694]: I0217 17:04:36.915209 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-59959cfcd4-nr2rp"] Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.436924 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.540164 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerStarted","Data":"99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f"} Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.540283 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="proxy-httpd" containerID="cri-o://99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f" gracePeriod=30 Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.540326 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.540283 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="sg-core" containerID="cri-o://14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc" gracePeriod=30 Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.540283 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-central-agent" containerID="cri-o://ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15" gracePeriod=30 Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.540451 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-notification-agent" containerID="cri-o://1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952" gracePeriod=30 Feb 17 17:04:37 crc kubenswrapper[4694]: I0217 17:04:37.563878 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037971013 podStartE2EDuration="5.563859697s" podCreationTimestamp="2026-02-17 17:04:32 +0000 UTC" 
firstStartedPulling="2026-02-17 17:04:33.752472479 +0000 UTC m=+1341.509547803" lastFinishedPulling="2026-02-17 17:04:37.278361163 +0000 UTC m=+1345.035436487" observedRunningTime="2026-02-17 17:04:37.560073984 +0000 UTC m=+1345.317149308" watchObservedRunningTime="2026-02-17 17:04:37.563859697 +0000 UTC m=+1345.320935021" Feb 17 17:04:38 crc kubenswrapper[4694]: I0217 17:04:38.557592 4694 generic.go:334] "Generic (PLEG): container finished" podID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerID="14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc" exitCode=2 Feb 17 17:04:38 crc kubenswrapper[4694]: I0217 17:04:38.557986 4694 generic.go:334] "Generic (PLEG): container finished" podID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerID="1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952" exitCode=0 Feb 17 17:04:38 crc kubenswrapper[4694]: I0217 17:04:38.557703 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerDied","Data":"14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc"} Feb 17 17:04:38 crc kubenswrapper[4694]: I0217 17:04:38.558864 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerDied","Data":"1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952"} Feb 17 17:04:38 crc kubenswrapper[4694]: I0217 17:04:38.915628 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" path="/var/lib/kubelet/pods/58c86e79-4506-4f20-83e3-1e7e85c07c80/volumes" Feb 17 17:04:41 crc kubenswrapper[4694]: I0217 17:04:41.791899 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:04:41 crc kubenswrapper[4694]: I0217 17:04:41.792801 4694 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-log" containerID="cri-o://7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842" gracePeriod=30 Feb 17 17:04:41 crc kubenswrapper[4694]: I0217 17:04:41.792974 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-httpd" containerID="cri-o://241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe" gracePeriod=30 Feb 17 17:04:42 crc kubenswrapper[4694]: I0217 17:04:42.606653 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" event={"ID":"146d2c58-f359-4372-810a-7ab64e022ad1","Type":"ContainerStarted","Data":"de510ba0bf86b421cacbe4e0c732310ef7108bd7e3abfe754f209c8de1f23fef"} Feb 17 17:04:42 crc kubenswrapper[4694]: I0217 17:04:42.609643 4694 generic.go:334] "Generic (PLEG): container finished" podID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerID="7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842" exitCode=143 Feb 17 17:04:42 crc kubenswrapper[4694]: I0217 17:04:42.609672 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15056d4d-99d7-4c45-bd24-8141aeca9791","Type":"ContainerDied","Data":"7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842"} Feb 17 17:04:42 crc kubenswrapper[4694]: E0217 17:04:42.908317 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd525bca6_b587_4bac_a01b_d9b410ad69f6.slice/crio-ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.465665 4694 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" podStartSLOduration=2.044081125 podStartE2EDuration="9.465650285s" podCreationTimestamp="2026-02-17 17:04:34 +0000 UTC" firstStartedPulling="2026-02-17 17:04:34.830552699 +0000 UTC m=+1342.587628023" lastFinishedPulling="2026-02-17 17:04:42.252121859 +0000 UTC m=+1350.009197183" observedRunningTime="2026-02-17 17:04:42.6324961 +0000 UTC m=+1350.389571434" watchObservedRunningTime="2026-02-17 17:04:43.465650285 +0000 UTC m=+1351.222725609" Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.472700 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.472945 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-log" containerID="cri-o://2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30" gracePeriod=30 Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.473030 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-httpd" containerID="cri-o://b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1" gracePeriod=30 Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.622245 4694 generic.go:334] "Generic (PLEG): container finished" podID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerID="2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30" exitCode=143 Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.622306 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5511f1e0-fc43-4f6f-81d4-8eb5655aea61","Type":"ContainerDied","Data":"2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30"} Feb 17 17:04:43 
crc kubenswrapper[4694]: I0217 17:04:43.625874 4694 generic.go:334] "Generic (PLEG): container finished" podID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerID="ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15" exitCode=0 Feb 17 17:04:43 crc kubenswrapper[4694]: I0217 17:04:43.626707 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerDied","Data":"ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15"} Feb 17 17:04:44 crc kubenswrapper[4694]: I0217 17:04:44.617998 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:04:44 crc kubenswrapper[4694]: I0217 17:04:44.618078 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.494120 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.550851 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-config-data\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.550968 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-httpd-run\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.550990 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-public-tls-certs\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.551123 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsjlh\" (UniqueName: \"kubernetes.io/projected/15056d4d-99d7-4c45-bd24-8141aeca9791-kube-api-access-jsjlh\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.551160 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-combined-ca-bundle\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.551179 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-scripts\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.551202 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-logs\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.551217 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"15056d4d-99d7-4c45-bd24-8141aeca9791\" (UID: \"15056d4d-99d7-4c45-bd24-8141aeca9791\") " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.552097 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-logs" (OuterVolumeSpecName: "logs") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.552121 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.562275 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.569867 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15056d4d-99d7-4c45-bd24-8141aeca9791-kube-api-access-jsjlh" (OuterVolumeSpecName: "kube-api-access-jsjlh") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "kube-api-access-jsjlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.574672 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-scripts" (OuterVolumeSpecName: "scripts") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.613897 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.645718 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-config-data" (OuterVolumeSpecName: "config-data") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.647664 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15056d4d-99d7-4c45-bd24-8141aeca9791" (UID: "15056d4d-99d7-4c45-bd24-8141aeca9791"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.652903 4694 generic.go:334] "Generic (PLEG): container finished" podID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerID="241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe" exitCode=0 Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.653083 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15056d4d-99d7-4c45-bd24-8141aeca9791","Type":"ContainerDied","Data":"241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe"} Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.653201 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.653205 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15056d4d-99d7-4c45-bd24-8141aeca9791","Type":"ContainerDied","Data":"450afac0a7ba73c8ff738a817d80e92f425f23ba7d452583562ba78ddb84797f"} Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.653219 4694 scope.go:117] "RemoveContainer" containerID="241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654402 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsjlh\" (UniqueName: \"kubernetes.io/projected/15056d4d-99d7-4c45-bd24-8141aeca9791-kube-api-access-jsjlh\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654474 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654543 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654689 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654770 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654835 4694 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654892 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15056d4d-99d7-4c45-bd24-8141aeca9791-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.654952 4694 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15056d4d-99d7-4c45-bd24-8141aeca9791-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.689713 4694 scope.go:117] "RemoveContainer" containerID="7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.694297 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.700718 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.708968 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.710413 4694 scope.go:117] "RemoveContainer" containerID="241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe" Feb 17 17:04:45 crc kubenswrapper[4694]: E0217 17:04:45.711874 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe\": container with ID starting with 241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe not found: ID does not exist" 
containerID="241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.711915 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe"} err="failed to get container status \"241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe\": rpc error: code = NotFound desc = could not find container \"241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe\": container with ID starting with 241ca61cfc2e34528ece58ae8f3bccedc2092e44375caece0511992482b875fe not found: ID does not exist" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.711956 4694 scope.go:117] "RemoveContainer" containerID="7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842" Feb 17 17:04:45 crc kubenswrapper[4694]: E0217 17:04:45.712325 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842\": container with ID starting with 7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842 not found: ID does not exist" containerID="7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.712355 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842"} err="failed to get container status \"7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842\": rpc error: code = NotFound desc = could not find container \"7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842\": container with ID starting with 7aab0e484b0939851846ba2601742113bc649a302ab928a8fce9274de7c79842 not found: ID does not exist" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721025 4694 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:04:45 crc kubenswrapper[4694]: E0217 17:04:45.721425 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-log" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721439 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-log" Feb 17 17:04:45 crc kubenswrapper[4694]: E0217 17:04:45.721466 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-log" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721473 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-log" Feb 17 17:04:45 crc kubenswrapper[4694]: E0217 17:04:45.721484 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-api" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721490 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-api" Feb 17 17:04:45 crc kubenswrapper[4694]: E0217 17:04:45.721502 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-httpd" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721507 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-httpd" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721694 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-log" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721715 4694 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-log" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721725 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" containerName="glance-httpd" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.721737 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c86e79-4506-4f20-83e3-1e7e85c07c80" containerName="placement-api" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.722646 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.724751 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.725849 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.729892 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759145 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759215 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc 
kubenswrapper[4694]: I0217 17:04:45.759258 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759289 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48a41b5-4a74-4883-a067-660e674ceecb-logs\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759354 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-scripts\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759387 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t26s4\" (UniqueName: \"kubernetes.io/projected/b48a41b5-4a74-4883-a067-660e674ceecb-kube-api-access-t26s4\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759434 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48a41b5-4a74-4883-a067-660e674ceecb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" 
Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759504 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-config-data\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.759731 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.792762 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.860950 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48a41b5-4a74-4883-a067-660e674ceecb-logs\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861031 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-scripts\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861064 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t26s4\" (UniqueName: \"kubernetes.io/projected/b48a41b5-4a74-4883-a067-660e674ceecb-kube-api-access-t26s4\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861098 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48a41b5-4a74-4883-a067-660e674ceecb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861155 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-config-data\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861209 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861243 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.861749 4694 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b48a41b5-4a74-4883-a067-660e674ceecb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.862146 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48a41b5-4a74-4883-a067-660e674ceecb-logs\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.864712 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.864803 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.865280 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-scripts\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.865583 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b48a41b5-4a74-4883-a067-660e674ceecb-config-data\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:45 crc kubenswrapper[4694]: I0217 17:04:45.878742 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t26s4\" (UniqueName: \"kubernetes.io/projected/b48a41b5-4a74-4883-a067-660e674ceecb-kube-api-access-t26s4\") pod \"glance-default-external-api-0\" (UID: \"b48a41b5-4a74-4883-a067-660e674ceecb\") " pod="openstack/glance-default-external-api-0" Feb 17 17:04:46 crc kubenswrapper[4694]: I0217 17:04:46.040660 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 17:04:46 crc kubenswrapper[4694]: I0217 17:04:46.630171 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 17:04:46 crc kubenswrapper[4694]: I0217 17:04:46.662190 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b48a41b5-4a74-4883-a067-660e674ceecb","Type":"ContainerStarted","Data":"efbe51235221e2823d523728b95454acc92dc0dd9eace7bcb36756a79ca6a946"} Feb 17 17:04:46 crc kubenswrapper[4694]: I0217 17:04:46.913291 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15056d4d-99d7-4c45-bd24-8141aeca9791" path="/var/lib/kubelet/pods/15056d4d-99d7-4c45-bd24-8141aeca9791/volumes" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.241522 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309191 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-scripts\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309252 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d59j\" (UniqueName: \"kubernetes.io/projected/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-kube-api-access-9d59j\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309307 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-combined-ca-bundle\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309347 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-httpd-run\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309383 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309411 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-logs\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309540 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-config-data\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.309574 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-internal-tls-certs\") pod \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\" (UID: \"5511f1e0-fc43-4f6f-81d4-8eb5655aea61\") " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.310017 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.310500 4694 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.310881 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-logs" (OuterVolumeSpecName: "logs") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.319818 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.320035 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-kube-api-access-9d59j" (OuterVolumeSpecName: "kube-api-access-9d59j") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "kube-api-access-9d59j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.348217 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.356034 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-scripts" (OuterVolumeSpecName: "scripts") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.412282 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.412320 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d59j\" (UniqueName: \"kubernetes.io/projected/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-kube-api-access-9d59j\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.412333 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.412366 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.412382 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.453127 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-config-data" (OuterVolumeSpecName: "config-data") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.455659 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.489483 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5511f1e0-fc43-4f6f-81d4-8eb5655aea61" (UID: "5511f1e0-fc43-4f6f-81d4-8eb5655aea61"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.514190 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.514226 4694 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5511f1e0-fc43-4f6f-81d4-8eb5655aea61-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.514237 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.672048 4694 generic.go:334] "Generic (PLEG): container finished" podID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerID="b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1" exitCode=0 Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.672109 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5511f1e0-fc43-4f6f-81d4-8eb5655aea61","Type":"ContainerDied","Data":"b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1"} Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.672139 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5511f1e0-fc43-4f6f-81d4-8eb5655aea61","Type":"ContainerDied","Data":"5d17b79f6f0ca0e20ff28597c2cb8e5c9c3fa37de23b92f7cc706f85b5d8f93c"} Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.672157 4694 scope.go:117] "RemoveContainer" containerID="b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.672300 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.676834 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b48a41b5-4a74-4883-a067-660e674ceecb","Type":"ContainerStarted","Data":"88d5d88f7d278ae742257fe041f95cdcfd9a925efbd42fb60ba870b02f2c49bc"} Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.707102 4694 scope.go:117] "RemoveContainer" containerID="2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.716949 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.730463 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.739228 4694 scope.go:117] "RemoveContainer" containerID="b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1" Feb 17 17:04:47 crc kubenswrapper[4694]: E0217 17:04:47.739557 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1\": container with ID starting with b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1 not found: ID does not exist" containerID="b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.739582 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1"} err="failed to get container status \"b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1\": rpc error: code = NotFound desc = could not find container \"b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1\": container with ID starting with b60503e3a33be7a6d3750cc85a844d2667d90ba03b90a5b090c79150d1624ed1 not found: ID does not exist" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.739601 4694 scope.go:117] "RemoveContainer" containerID="2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30" Feb 17 17:04:47 crc kubenswrapper[4694]: E0217 17:04:47.740019 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30\": container with ID starting with 2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30 not found: ID does not exist" containerID="2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.740041 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30"} err="failed to get container status \"2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30\": rpc error: code = NotFound desc = could not find container 
\"2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30\": container with ID starting with 2fbfce13a2f9a6db2a924ea50f6ef5cb59e95620f12e10541970a91d3cad9c30 not found: ID does not exist" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.745819 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:04:47 crc kubenswrapper[4694]: E0217 17:04:47.746237 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-httpd" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.746253 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-httpd" Feb 17 17:04:47 crc kubenswrapper[4694]: E0217 17:04:47.746293 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-log" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.746300 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-log" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.746780 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-httpd" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.746798 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" containerName="glance-log" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.747892 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.754123 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.754503 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.765299 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.818502 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.818587 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.818721 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.818828 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e061fc0e-dd5f-429f-8275-0a744dfc846d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.818865 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.819008 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e061fc0e-dd5f-429f-8275-0a744dfc846d-logs\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.819044 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnxk4\" (UniqueName: \"kubernetes.io/projected/e061fc0e-dd5f-429f-8275-0a744dfc846d-kube-api-access-gnxk4\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.819063 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.920967 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921041 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921092 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e061fc0e-dd5f-429f-8275-0a744dfc846d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921122 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921205 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e061fc0e-dd5f-429f-8275-0a744dfc846d-logs\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921235 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnxk4\" (UniqueName: 
\"kubernetes.io/projected/e061fc0e-dd5f-429f-8275-0a744dfc846d-kube-api-access-gnxk4\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921258 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.921319 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.922223 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e061fc0e-dd5f-429f-8275-0a744dfc846d-logs\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.922233 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.922256 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e061fc0e-dd5f-429f-8275-0a744dfc846d-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.926598 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.926903 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.927804 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.928482 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e061fc0e-dd5f-429f-8275-0a744dfc846d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.941285 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnxk4\" (UniqueName: \"kubernetes.io/projected/e061fc0e-dd5f-429f-8275-0a744dfc846d-kube-api-access-gnxk4\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 17:04:47 crc kubenswrapper[4694]: I0217 17:04:47.957401 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"e061fc0e-dd5f-429f-8275-0a744dfc846d\") " pod="openstack/glance-default-internal-api-0" Feb 17 17:04:48 crc kubenswrapper[4694]: I0217 17:04:48.076310 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:48 crc kubenswrapper[4694]: I0217 17:04:48.689426 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b48a41b5-4a74-4883-a067-660e674ceecb","Type":"ContainerStarted","Data":"f7fcc67639fa552de00031a5769e7523eae282eb79deaf9c78f0c1c073a9bbd4"} Feb 17 17:04:48 crc kubenswrapper[4694]: W0217 17:04:48.695273 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode061fc0e_dd5f_429f_8275_0a744dfc846d.slice/crio-9c65aa3273c1b2ce982bdaea0675cdd9a60a6eec790b3203d318a5a60160974b WatchSource:0}: Error finding container 9c65aa3273c1b2ce982bdaea0675cdd9a60a6eec790b3203d318a5a60160974b: Status 404 returned error can't find the container with id 9c65aa3273c1b2ce982bdaea0675cdd9a60a6eec790b3203d318a5a60160974b Feb 17 17:04:48 crc kubenswrapper[4694]: I0217 17:04:48.702147 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 17:04:48 crc kubenswrapper[4694]: I0217 17:04:48.741884 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.741864612 podStartE2EDuration="3.741864612s" podCreationTimestamp="2026-02-17 17:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:48.735435943 +0000 UTC m=+1356.492511277" watchObservedRunningTime="2026-02-17 17:04:48.741864612 +0000 UTC m=+1356.498939926" Feb 17 17:04:48 crc kubenswrapper[4694]: I0217 17:04:48.906826 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5511f1e0-fc43-4f6f-81d4-8eb5655aea61" path="/var/lib/kubelet/pods/5511f1e0-fc43-4f6f-81d4-8eb5655aea61/volumes" Feb 17 17:04:49 crc kubenswrapper[4694]: I0217 17:04:49.706435 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e061fc0e-dd5f-429f-8275-0a744dfc846d","Type":"ContainerStarted","Data":"14c85c98b73961bf2127364c2736a54a9ea200eedc75f73854fadd6ae169fbe5"} Feb 17 17:04:49 crc kubenswrapper[4694]: I0217 17:04:49.706799 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e061fc0e-dd5f-429f-8275-0a744dfc846d","Type":"ContainerStarted","Data":"b1e7cf64c0664429787224c31bde3be9eaeefe132adb760cf90dacd5110e8981"} Feb 17 17:04:49 crc kubenswrapper[4694]: I0217 17:04:49.706816 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e061fc0e-dd5f-429f-8275-0a744dfc846d","Type":"ContainerStarted","Data":"9c65aa3273c1b2ce982bdaea0675cdd9a60a6eec790b3203d318a5a60160974b"} Feb 17 17:04:49 crc kubenswrapper[4694]: I0217 17:04:49.726722 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.726704625 podStartE2EDuration="2.726704625s" podCreationTimestamp="2026-02-17 17:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:49.724760697 +0000 UTC m=+1357.481836021" watchObservedRunningTime="2026-02-17 17:04:49.726704625 +0000 UTC m=+1357.483779949" Feb 17 17:04:53 crc kubenswrapper[4694]: 
I0217 17:04:53.744166 4694 generic.go:334] "Generic (PLEG): container finished" podID="146d2c58-f359-4372-810a-7ab64e022ad1" containerID="de510ba0bf86b421cacbe4e0c732310ef7108bd7e3abfe754f209c8de1f23fef" exitCode=0 Feb 17 17:04:53 crc kubenswrapper[4694]: I0217 17:04:53.744350 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" event={"ID":"146d2c58-f359-4372-810a-7ab64e022ad1","Type":"ContainerDied","Data":"de510ba0bf86b421cacbe4e0c732310ef7108bd7e3abfe754f209c8de1f23fef"} Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.101974 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.255350 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-combined-ca-bundle\") pod \"146d2c58-f359-4372-810a-7ab64e022ad1\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.255416 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-config-data\") pod \"146d2c58-f359-4372-810a-7ab64e022ad1\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.255488 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-scripts\") pod \"146d2c58-f359-4372-810a-7ab64e022ad1\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.255543 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjcl\" (UniqueName: 
\"kubernetes.io/projected/146d2c58-f359-4372-810a-7ab64e022ad1-kube-api-access-5jjcl\") pod \"146d2c58-f359-4372-810a-7ab64e022ad1\" (UID: \"146d2c58-f359-4372-810a-7ab64e022ad1\") " Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.261790 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-scripts" (OuterVolumeSpecName: "scripts") pod "146d2c58-f359-4372-810a-7ab64e022ad1" (UID: "146d2c58-f359-4372-810a-7ab64e022ad1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.265777 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146d2c58-f359-4372-810a-7ab64e022ad1-kube-api-access-5jjcl" (OuterVolumeSpecName: "kube-api-access-5jjcl") pod "146d2c58-f359-4372-810a-7ab64e022ad1" (UID: "146d2c58-f359-4372-810a-7ab64e022ad1"). InnerVolumeSpecName "kube-api-access-5jjcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.287572 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "146d2c58-f359-4372-810a-7ab64e022ad1" (UID: "146d2c58-f359-4372-810a-7ab64e022ad1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.288912 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-config-data" (OuterVolumeSpecName: "config-data") pod "146d2c58-f359-4372-810a-7ab64e022ad1" (UID: "146d2c58-f359-4372-810a-7ab64e022ad1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.357818 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.357852 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.357863 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146d2c58-f359-4372-810a-7ab64e022ad1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.357876 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjcl\" (UniqueName: \"kubernetes.io/projected/146d2c58-f359-4372-810a-7ab64e022ad1-kube-api-access-5jjcl\") on node \"crc\" DevicePath \"\"" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.763447 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" event={"ID":"146d2c58-f359-4372-810a-7ab64e022ad1","Type":"ContainerDied","Data":"99aff341abac15b0cdb0984764f73a6256cb76438389561dccf2439d48579ab3"} Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.763538 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99aff341abac15b0cdb0984764f73a6256cb76438389561dccf2439d48579ab3" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.763483 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8fpg7" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.860151 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 17:04:55 crc kubenswrapper[4694]: E0217 17:04:55.860503 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146d2c58-f359-4372-810a-7ab64e022ad1" containerName="nova-cell0-conductor-db-sync" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.860519 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="146d2c58-f359-4372-810a-7ab64e022ad1" containerName="nova-cell0-conductor-db-sync" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.860720 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="146d2c58-f359-4372-810a-7ab64e022ad1" containerName="nova-cell0-conductor-db-sync" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.861279 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.864232 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dg5ch" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.866974 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsggd\" (UniqueName: \"kubernetes.io/projected/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-kube-api-access-jsggd\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.867207 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " 
pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.867241 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.867479 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.872340 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.969369 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsggd\" (UniqueName: \"kubernetes.io/projected/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-kube-api-access-jsggd\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.969585 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.969662 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.974407 4694 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.974595 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:55 crc kubenswrapper[4694]: I0217 17:04:55.989759 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsggd\" (UniqueName: \"kubernetes.io/projected/6d7b20d4-ec67-4732-bb23-97f5dacf1af1-kube-api-access-jsggd\") pod \"nova-cell0-conductor-0\" (UID: \"6d7b20d4-ec67-4732-bb23-97f5dacf1af1\") " pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.041399 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.041466 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.079442 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.086207 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.180309 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.590782 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 17:04:56 crc kubenswrapper[4694]: W0217 17:04:56.596309 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d7b20d4_ec67_4732_bb23_97f5dacf1af1.slice/crio-4970f8412b5ec85c146da0c60215fd436730a0bb5d1e84132dfee8a36c7848fc WatchSource:0}: Error finding container 4970f8412b5ec85c146da0c60215fd436730a0bb5d1e84132dfee8a36c7848fc: Status 404 returned error can't find the container with id 4970f8412b5ec85c146da0c60215fd436730a0bb5d1e84132dfee8a36c7848fc Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.773548 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6d7b20d4-ec67-4732-bb23-97f5dacf1af1","Type":"ContainerStarted","Data":"4970f8412b5ec85c146da0c60215fd436730a0bb5d1e84132dfee8a36c7848fc"} Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.773942 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 17:04:56 crc kubenswrapper[4694]: I0217 17:04:56.773981 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 17:04:57 crc kubenswrapper[4694]: I0217 17:04:57.789388 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6d7b20d4-ec67-4732-bb23-97f5dacf1af1","Type":"ContainerStarted","Data":"12ad50527343eedd13268d47093e2226e64f6aa1f421e23eeff8fce59ea53f60"} Feb 17 17:04:57 crc kubenswrapper[4694]: I0217 17:04:57.791003 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 17:04:57 crc kubenswrapper[4694]: I0217 17:04:57.812253 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.812223121 podStartE2EDuration="2.812223121s" podCreationTimestamp="2026-02-17 17:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:04:57.807886064 +0000 UTC m=+1365.564961428" watchObservedRunningTime="2026-02-17 17:04:57.812223121 +0000 UTC m=+1365.569298485" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.076486 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.076989 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.148596 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.153404 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.650960 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.653416 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.799570 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 17:04:58 crc kubenswrapper[4694]: I0217 17:04:58.799931 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 17:05:00 crc kubenswrapper[4694]: 
I0217 17:05:00.921988 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 17:05:00 crc kubenswrapper[4694]: I0217 17:05:00.923118 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.206549 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.776898 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4tzgw"] Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.780172 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.786400 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.791485 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.807340 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4tzgw"] Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.883946 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-scripts\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.884625 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.884674 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-config-data\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.884701 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5kgn\" (UniqueName: \"kubernetes.io/projected/16a2de30-5a77-4179-b166-fcc003c41c17-kube-api-access-x5kgn\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.923547 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.927466 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.929453 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.941291 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986586 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986669 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-config-data\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986691 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-config-data\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986707 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5kgn\" (UniqueName: \"kubernetes.io/projected/16a2de30-5a77-4179-b166-fcc003c41c17-kube-api-access-x5kgn\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986765 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9g8r\" (UniqueName: \"kubernetes.io/projected/717ffc09-ec9c-4445-accd-340dd9f758d9-kube-api-access-c9g8r\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986792 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-scripts\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986809 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.986931 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717ffc09-ec9c-4445-accd-340dd9f758d9-logs\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.992864 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-config-data\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:01 crc kubenswrapper[4694]: I0217 17:05:01.998079 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.005815 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.007046 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.008082 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-scripts\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.010517 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.016261 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5kgn\" (UniqueName: \"kubernetes.io/projected/16a2de30-5a77-4179-b166-fcc003c41c17-kube-api-access-x5kgn\") pod \"nova-cell0-cell-mapping-4tzgw\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.031879 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.083483 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.085146 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.087170 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088348 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088400 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-config-data\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088453 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9g8r\" (UniqueName: \"kubernetes.io/projected/717ffc09-ec9c-4445-accd-340dd9f758d9-kube-api-access-c9g8r\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088482 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088565 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-config-data\") pod \"nova-scheduler-0\" (UID: 
\"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088594 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqlm2\" (UniqueName: \"kubernetes.io/projected/2fba692a-e3f2-413e-9493-c401862d4626-kube-api-access-jqlm2\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.088634 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717ffc09-ec9c-4445-accd-340dd9f758d9-logs\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.090058 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717ffc09-ec9c-4445-accd-340dd9f758d9-logs\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.095442 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-config-data\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.106726 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.107050 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc 
kubenswrapper[4694]: I0217 17:05:02.110054 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.130389 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9g8r\" (UniqueName: \"kubernetes.io/projected/717ffc09-ec9c-4445-accd-340dd9f758d9-kube-api-access-c9g8r\") pod \"nova-api-0\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.181804 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.183740 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.189008 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.189845 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-config-data\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.189908 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqlm2\" (UniqueName: \"kubernetes.io/projected/2fba692a-e3f2-413e-9493-c401862d4626-kube-api-access-jqlm2\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.189932 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.189986 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.190038 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.190067 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpcth\" (UniqueName: \"kubernetes.io/projected/e81ac821-dfe2-4062-ac2c-82e9c82fac91-kube-api-access-cpcth\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.197548 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-config-data\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.204383 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.212313 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.238248 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqlm2\" (UniqueName: \"kubernetes.io/projected/2fba692a-e3f2-413e-9493-c401862d4626-kube-api-access-jqlm2\") pod \"nova-scheduler-0\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.250258 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.292644 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293451 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpcth\" (UniqueName: \"kubernetes.io/projected/e81ac821-dfe2-4062-ac2c-82e9c82fac91-kube-api-access-cpcth\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293478 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293545 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293587 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74173fce-ed52-4c41-9285-c47dfcf29abb-logs\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293629 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-config-data\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293677 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2gq\" (UniqueName: \"kubernetes.io/projected/74173fce-ed52-4c41-9285-c47dfcf29abb-kube-api-access-9k2gq\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.293719 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.303139 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " 
pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.307205 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.315519 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8wkzq"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.317165 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.319764 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpcth\" (UniqueName: \"kubernetes.io/projected/e81ac821-dfe2-4062-ac2c-82e9c82fac91-kube-api-access-cpcth\") pod \"nova-cell1-novncproxy-0\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.334649 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8wkzq"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.335087 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395203 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395275 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74173fce-ed52-4c41-9285-c47dfcf29abb-logs\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395304 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395350 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-svc\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395377 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-config-data\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 
crc kubenswrapper[4694]: I0217 17:05:02.395410 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k2gq\" (UniqueName: \"kubernetes.io/projected/74173fce-ed52-4c41-9285-c47dfcf29abb-kube-api-access-9k2gq\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395508 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395551 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-config\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395643 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.395717 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758ff\" (UniqueName: \"kubernetes.io/projected/a12e4044-ba57-433d-9418-1a335dba1f0c-kube-api-access-758ff\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 
17:05:02.396184 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74173fce-ed52-4c41-9285-c47dfcf29abb-logs\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.405132 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-config-data\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.405736 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.417300 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2gq\" (UniqueName: \"kubernetes.io/projected/74173fce-ed52-4c41-9285-c47dfcf29abb-kube-api-access-9k2gq\") pod \"nova-metadata-0\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.498406 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-config\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.498796 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.498839 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-758ff\" (UniqueName: \"kubernetes.io/projected/a12e4044-ba57-433d-9418-1a335dba1f0c-kube-api-access-758ff\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.498867 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.498903 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.498936 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-svc\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.500028 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-svc\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: 
\"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.500537 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-config\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.501041 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.503299 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.503554 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.543152 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-758ff\" (UniqueName: \"kubernetes.io/projected/a12e4044-ba57-433d-9418-1a335dba1f0c-kube-api-access-758ff\") pod \"dnsmasq-dns-865f5d856f-8wkzq\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " 
pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.672874 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.692870 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.715534 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4tzgw"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.855767 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kpsxt"] Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.857949 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.862054 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.862341 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 17 17:05:02 crc kubenswrapper[4694]: I0217 17:05:02.868040 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kpsxt"] Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.906469 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd9zk\" (UniqueName: \"kubernetes.io/projected/003c20cf-819e-4d24-ba0b-a66652b8d5a3-kube-api-access-vd9zk\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.906775 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.906847 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-scripts\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.906877 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-config-data\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.950984 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.973391 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2fba692a-e3f2-413e-9493-c401862d4626","Type":"ContainerStarted","Data":"6975d049100c1567a326a3c3c33c510cd07053c4544ae9eaca801bbd11a2a303"} Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:02.974638 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4tzgw" event={"ID":"16a2de30-5a77-4179-b166-fcc003c41c17","Type":"ContainerStarted","Data":"b37fdb9985cafa3bd41de71abed44d0663bda06a1610744d81da31a2c450c495"} Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.010567 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-config-data\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.010897 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd9zk\" (UniqueName: \"kubernetes.io/projected/003c20cf-819e-4d24-ba0b-a66652b8d5a3-kube-api-access-vd9zk\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.010952 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.011098 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-scripts\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.019551 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-config-data\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.020105 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-scripts\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.030798 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.043783 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd9zk\" (UniqueName: \"kubernetes.io/projected/003c20cf-819e-4d24-ba0b-a66652b8d5a3-kube-api-access-vd9zk\") pod \"nova-cell1-conductor-db-sync-kpsxt\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.044498 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.204230 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.210573 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.300603 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 17:05:03 crc kubenswrapper[4694]: W0217 17:05:03.340291 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74173fce_ed52_4c41_9285_c47dfcf29abb.slice/crio-00771d946de1f1f70025ea505f340915513f997ca2d66fa1eee3e047f162fc71 WatchSource:0}: Error finding container 00771d946de1f1f70025ea505f340915513f997ca2d66fa1eee3e047f162fc71: Status 404 returned error can't find the container with id 00771d946de1f1f70025ea505f340915513f997ca2d66fa1eee3e047f162fc71 Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.343525 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.436647 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8wkzq"] Feb 17 17:05:03 crc kubenswrapper[4694]: I0217 17:05:03.756876 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kpsxt"] Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.002533 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4tzgw" event={"ID":"16a2de30-5a77-4179-b166-fcc003c41c17","Type":"ContainerStarted","Data":"e426f8ac564c7fb8d937d55a954486801a1888cee574223beb1e18d07bac18d5"} Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.004360 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"74173fce-ed52-4c41-9285-c47dfcf29abb","Type":"ContainerStarted","Data":"00771d946de1f1f70025ea505f340915513f997ca2d66fa1eee3e047f162fc71"} Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.006229 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" event={"ID":"003c20cf-819e-4d24-ba0b-a66652b8d5a3","Type":"ContainerStarted","Data":"cf562f9008053c879d98c22728873b6400c22c7d0e8c3b2050a277b38274ce77"} Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.019361 4694 generic.go:334] "Generic (PLEG): container finished" podID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerID="f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db" exitCode=0 Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.019435 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" event={"ID":"a12e4044-ba57-433d-9418-1a335dba1f0c","Type":"ContainerDied","Data":"f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db"} Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.019464 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" event={"ID":"a12e4044-ba57-433d-9418-1a335dba1f0c","Type":"ContainerStarted","Data":"74260c4cdbe12c0854fe0712e2da1df50b1811d14e09ed846146c876fcd686a5"} Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.021209 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e81ac821-dfe2-4062-ac2c-82e9c82fac91","Type":"ContainerStarted","Data":"ddba52eb565c736f920920833c02beb752ed8939869f1920994fbb060114a456"} Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.025836 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4tzgw" podStartSLOduration=3.025819812 podStartE2EDuration="3.025819812s" podCreationTimestamp="2026-02-17 17:05:01 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:04.017074637 +0000 UTC m=+1371.774149961" watchObservedRunningTime="2026-02-17 17:05:04.025819812 +0000 UTC m=+1371.782895136" Feb 17 17:05:04 crc kubenswrapper[4694]: I0217 17:05:04.030010 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"717ffc09-ec9c-4445-accd-340dd9f758d9","Type":"ContainerStarted","Data":"a6f89b130ccb8cf47291a8258bea570d68c60021c9eeb899249da669df29f8a1"} Feb 17 17:05:05 crc kubenswrapper[4694]: I0217 17:05:05.038318 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" event={"ID":"003c20cf-819e-4d24-ba0b-a66652b8d5a3","Type":"ContainerStarted","Data":"641e40d4546df29de4a08a0e11e9bc84004564bc7210af31fd35b53da7878083"} Feb 17 17:05:05 crc kubenswrapper[4694]: I0217 17:05:05.043832 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" event={"ID":"a12e4044-ba57-433d-9418-1a335dba1f0c","Type":"ContainerStarted","Data":"a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83"} Feb 17 17:05:05 crc kubenswrapper[4694]: I0217 17:05:05.068073 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" podStartSLOduration=3.068055898 podStartE2EDuration="3.068055898s" podCreationTimestamp="2026-02-17 17:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:05.062820209 +0000 UTC m=+1372.819895533" watchObservedRunningTime="2026-02-17 17:05:05.068055898 +0000 UTC m=+1372.825131212" Feb 17 17:05:05 crc kubenswrapper[4694]: I0217 17:05:05.102543 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" podStartSLOduration=3.102521787 podStartE2EDuration="3.102521787s" 
podCreationTimestamp="2026-02-17 17:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:05.088397099 +0000 UTC m=+1372.845472423" watchObservedRunningTime="2026-02-17 17:05:05.102521787 +0000 UTC m=+1372.859597111" Feb 17 17:05:05 crc kubenswrapper[4694]: I0217 17:05:05.638417 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:05 crc kubenswrapper[4694]: I0217 17:05:05.715174 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:06 crc kubenswrapper[4694]: I0217 17:05:06.056625 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.066284 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"717ffc09-ec9c-4445-accd-340dd9f758d9","Type":"ContainerStarted","Data":"ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78"} Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.066661 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"717ffc09-ec9c-4445-accd-340dd9f758d9","Type":"ContainerStarted","Data":"5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac"} Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.069137 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74173fce-ed52-4c41-9285-c47dfcf29abb","Type":"ContainerStarted","Data":"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185"} Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.069167 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"74173fce-ed52-4c41-9285-c47dfcf29abb","Type":"ContainerStarted","Data":"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3"} Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.069313 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-log" containerID="cri-o://ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3" gracePeriod=30 Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.069318 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-metadata" containerID="cri-o://0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185" gracePeriod=30 Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.070945 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2fba692a-e3f2-413e-9493-c401862d4626","Type":"ContainerStarted","Data":"0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61"} Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.076084 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e81ac821-dfe2-4062-ac2c-82e9c82fac91","Type":"ContainerStarted","Data":"ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0"} Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.076511 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e81ac821-dfe2-4062-ac2c-82e9c82fac91" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0" gracePeriod=30 Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.092737 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=3.28472789 podStartE2EDuration="6.092714978s" podCreationTimestamp="2026-02-17 17:05:01 +0000 UTC" firstStartedPulling="2026-02-17 17:05:03.081863256 +0000 UTC m=+1370.838938580" lastFinishedPulling="2026-02-17 17:05:05.889850334 +0000 UTC m=+1373.646925668" observedRunningTime="2026-02-17 17:05:07.091303993 +0000 UTC m=+1374.848379357" watchObservedRunningTime="2026-02-17 17:05:07.092714978 +0000 UTC m=+1374.849790302" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.116108 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.228799292 podStartE2EDuration="6.116090934s" podCreationTimestamp="2026-02-17 17:05:01 +0000 UTC" firstStartedPulling="2026-02-17 17:05:02.951267329 +0000 UTC m=+1370.708342643" lastFinishedPulling="2026-02-17 17:05:05.838558961 +0000 UTC m=+1373.595634285" observedRunningTime="2026-02-17 17:05:07.115660594 +0000 UTC m=+1374.872735918" watchObservedRunningTime="2026-02-17 17:05:07.116090934 +0000 UTC m=+1374.873166258" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.155260 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.461136507 podStartE2EDuration="5.155233579s" podCreationTimestamp="2026-02-17 17:05:02 +0000 UTC" firstStartedPulling="2026-02-17 17:05:03.192380579 +0000 UTC m=+1370.949455903" lastFinishedPulling="2026-02-17 17:05:05.886477661 +0000 UTC m=+1373.643552975" observedRunningTime="2026-02-17 17:05:07.132091088 +0000 UTC m=+1374.889166412" watchObservedRunningTime="2026-02-17 17:05:07.155233579 +0000 UTC m=+1374.912308913" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.162333 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.61797524 podStartE2EDuration="5.162310843s" podCreationTimestamp="2026-02-17 17:05:02 +0000 UTC" 
firstStartedPulling="2026-02-17 17:05:03.343263856 +0000 UTC m=+1371.100339180" lastFinishedPulling="2026-02-17 17:05:05.887599439 +0000 UTC m=+1373.644674783" observedRunningTime="2026-02-17 17:05:07.151443815 +0000 UTC m=+1374.908519149" watchObservedRunningTime="2026-02-17 17:05:07.162310843 +0000 UTC m=+1374.919386177" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.293788 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.335707 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.673899 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.674229 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.752312 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.825400 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-config-data\") pod \"74173fce-ed52-4c41-9285-c47dfcf29abb\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.825513 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74173fce-ed52-4c41-9285-c47dfcf29abb-logs\") pod \"74173fce-ed52-4c41-9285-c47dfcf29abb\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.825636 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k2gq\" (UniqueName: \"kubernetes.io/projected/74173fce-ed52-4c41-9285-c47dfcf29abb-kube-api-access-9k2gq\") pod \"74173fce-ed52-4c41-9285-c47dfcf29abb\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.825677 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-combined-ca-bundle\") pod \"74173fce-ed52-4c41-9285-c47dfcf29abb\" (UID: \"74173fce-ed52-4c41-9285-c47dfcf29abb\") " Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.826347 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74173fce-ed52-4c41-9285-c47dfcf29abb-logs" (OuterVolumeSpecName: "logs") pod "74173fce-ed52-4c41-9285-c47dfcf29abb" (UID: "74173fce-ed52-4c41-9285-c47dfcf29abb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.832365 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74173fce-ed52-4c41-9285-c47dfcf29abb-kube-api-access-9k2gq" (OuterVolumeSpecName: "kube-api-access-9k2gq") pod "74173fce-ed52-4c41-9285-c47dfcf29abb" (UID: "74173fce-ed52-4c41-9285-c47dfcf29abb"). InnerVolumeSpecName "kube-api-access-9k2gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.868978 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-config-data" (OuterVolumeSpecName: "config-data") pod "74173fce-ed52-4c41-9285-c47dfcf29abb" (UID: "74173fce-ed52-4c41-9285-c47dfcf29abb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.877539 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74173fce-ed52-4c41-9285-c47dfcf29abb" (UID: "74173fce-ed52-4c41-9285-c47dfcf29abb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.928489 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74173fce-ed52-4c41-9285-c47dfcf29abb-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.928600 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k2gq\" (UniqueName: \"kubernetes.io/projected/74173fce-ed52-4c41-9285-c47dfcf29abb-kube-api-access-9k2gq\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.928632 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.928643 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74173fce-ed52-4c41-9285-c47dfcf29abb-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:07 crc kubenswrapper[4694]: I0217 17:05:07.992583 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.036067 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-sg-core-conf-yaml\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.038514 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlllt\" (UniqueName: \"kubernetes.io/projected/d525bca6-b587-4bac-a01b-d9b410ad69f6-kube-api-access-wlllt\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.038872 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-combined-ca-bundle\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.039038 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-log-httpd\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.039173 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-run-httpd\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.039381 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-config-data\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.039485 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-scripts\") pod \"d525bca6-b587-4bac-a01b-d9b410ad69f6\" (UID: \"d525bca6-b587-4bac-a01b-d9b410ad69f6\") " Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.040487 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.040788 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.049095 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-scripts" (OuterVolumeSpecName: "scripts") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.050324 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d525bca6-b587-4bac-a01b-d9b410ad69f6-kube-api-access-wlllt" (OuterVolumeSpecName: "kube-api-access-wlllt") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "kube-api-access-wlllt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.063212 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.091393 4694 generic.go:334] "Generic (PLEG): container finished" podID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerID="99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f" exitCode=137 Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.091451 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.091465 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerDied","Data":"99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f"} Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.091846 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d525bca6-b587-4bac-a01b-d9b410ad69f6","Type":"ContainerDied","Data":"00eafaed39d3886045095a2c22bbc5831a2feba0bff6c59153c2a9b9894a43a4"} Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.091865 4694 scope.go:117] "RemoveContainer" containerID="99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.100552 4694 generic.go:334] "Generic (PLEG): container finished" podID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerID="0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185" exitCode=0 Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.100578 4694 generic.go:334] "Generic (PLEG): container finished" podID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerID="ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3" exitCode=143 Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.100641 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74173fce-ed52-4c41-9285-c47dfcf29abb","Type":"ContainerDied","Data":"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185"} Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.100694 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74173fce-ed52-4c41-9285-c47dfcf29abb","Type":"ContainerDied","Data":"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3"} Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.100705 
4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74173fce-ed52-4c41-9285-c47dfcf29abb","Type":"ContainerDied","Data":"00771d946de1f1f70025ea505f340915513f997ca2d66fa1eee3e047f162fc71"} Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.100931 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.138135 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-config-data" (OuterVolumeSpecName: "config-data") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142163 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d525bca6-b587-4bac-a01b-d9b410ad69f6" (UID: "d525bca6-b587-4bac-a01b-d9b410ad69f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142830 4694 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142862 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlllt\" (UniqueName: \"kubernetes.io/projected/d525bca6-b587-4bac-a01b-d9b410ad69f6-kube-api-access-wlllt\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142875 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142890 4694 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142901 4694 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d525bca6-b587-4bac-a01b-d9b410ad69f6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142911 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.142919 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d525bca6-b587-4bac-a01b-d9b410ad69f6-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.191894 4694 scope.go:117] 
"RemoveContainer" containerID="14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.193129 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.200147 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.224212 4694 scope.go:117] "RemoveContainer" containerID="1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.225789 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.226245 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="proxy-httpd" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226266 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="proxy-httpd" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.226283 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-metadata" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226293 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-metadata" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.226305 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-log" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226312 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-log" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.226321 4694 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-central-agent" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226328 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-central-agent" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.226346 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="sg-core" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226353 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="sg-core" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.226367 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-notification-agent" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226376 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-notification-agent" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226635 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="proxy-httpd" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226657 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-notification-agent" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226678 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-metadata" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226696 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="sg-core" Feb 17 17:05:08 crc 
kubenswrapper[4694]: I0217 17:05:08.226710 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" containerName="nova-metadata-log" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.226721 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" containerName="ceilometer-central-agent" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.227739 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.230732 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.230902 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.245021 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.245086 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkpz5\" (UniqueName: \"kubernetes.io/projected/140f99c0-41c1-40ca-a2e7-2a23e7485d71-kube-api-access-hkpz5\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.245380 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140f99c0-41c1-40ca-a2e7-2a23e7485d71-logs\") pod \"nova-metadata-0\" (UID: 
\"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.245415 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.245443 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-config-data\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.265375 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.280375 4694 scope.go:117] "RemoveContainer" containerID="ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.295862 4694 scope.go:117] "RemoveContainer" containerID="99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.296238 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f\": container with ID starting with 99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f not found: ID does not exist" containerID="99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.296280 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f"} err="failed to get container status \"99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f\": rpc error: code = NotFound desc = could not find container \"99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f\": container with ID starting with 99838dd01b3ed7c8075e50ba1a17301771b1918994f4ea69d519b393c700080f not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.296307 4694 scope.go:117] "RemoveContainer" containerID="14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.296559 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc\": container with ID starting with 14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc not found: ID does not exist" containerID="14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.296591 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc"} err="failed to get container status \"14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc\": rpc error: code = NotFound desc = could not find container \"14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc\": container with ID starting with 14d03fe53e218b68f74a53bc449ef857a2522ef222ef39a7f7f0571b2358ddbc not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.296625 4694 scope.go:117] "RemoveContainer" containerID="1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.296802 4694 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952\": container with ID starting with 1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952 not found: ID does not exist" containerID="1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.296829 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952"} err="failed to get container status \"1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952\": rpc error: code = NotFound desc = could not find container \"1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952\": container with ID starting with 1f7615b8ec60a62dd19ebcc8dddfff5c6a62aceb3427cf671fe68f0c407cc952 not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.296843 4694 scope.go:117] "RemoveContainer" containerID="ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.297089 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15\": container with ID starting with ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15 not found: ID does not exist" containerID="ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.297110 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15"} err="failed to get container status \"ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15\": rpc error: code = NotFound desc = could not find container 
\"ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15\": container with ID starting with ddd661cce3596c4d308366cb0d738c95abc7300143c89263ac1cf1b138422d15 not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.297125 4694 scope.go:117] "RemoveContainer" containerID="0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.313052 4694 scope.go:117] "RemoveContainer" containerID="ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.329971 4694 scope.go:117] "RemoveContainer" containerID="0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185" Feb 17 17:05:08 crc kubenswrapper[4694]: E0217 17:05:08.331419 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185\": container with ID starting with 0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185 not found: ID does not exist" containerID="0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.331451 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185"} err="failed to get container status \"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185\": rpc error: code = NotFound desc = could not find container \"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185\": container with ID starting with 0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185 not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.331474 4694 scope.go:117] "RemoveContainer" containerID="ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3" Feb 17 17:05:08 crc 
kubenswrapper[4694]: E0217 17:05:08.331944 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3\": container with ID starting with ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3 not found: ID does not exist" containerID="ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.331965 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3"} err="failed to get container status \"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3\": rpc error: code = NotFound desc = could not find container \"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3\": container with ID starting with ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3 not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.331978 4694 scope.go:117] "RemoveContainer" containerID="0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.332275 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185"} err="failed to get container status \"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185\": rpc error: code = NotFound desc = could not find container \"0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185\": container with ID starting with 0d768b7a80c340878cb40c7e038d10e884ca27ba6b36772f046dcf696e7b7185 not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.332312 4694 scope.go:117] "RemoveContainer" containerID="ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3" Feb 17 
17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.332581 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3"} err="failed to get container status \"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3\": rpc error: code = NotFound desc = could not find container \"ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3\": container with ID starting with ac85ea9385b9b4a9b4f098c74ffe53152a77173505c246e4ea8c1916f2ec4bf3 not found: ID does not exist" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.347723 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140f99c0-41c1-40ca-a2e7-2a23e7485d71-logs\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.348115 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140f99c0-41c1-40ca-a2e7-2a23e7485d71-logs\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.348191 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.348640 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-config-data\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 
17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.348731 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.348826 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkpz5\" (UniqueName: \"kubernetes.io/projected/140f99c0-41c1-40ca-a2e7-2a23e7485d71-kube-api-access-hkpz5\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.351264 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-config-data\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.351272 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.352357 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.362915 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkpz5\" (UniqueName: 
\"kubernetes.io/projected/140f99c0-41c1-40ca-a2e7-2a23e7485d71-kube-api-access-hkpz5\") pod \"nova-metadata-0\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.446260 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.455684 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.468638 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.471342 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.475149 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.475329 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.488299 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552188 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552285 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-scripts\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " 
pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552309 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-log-httpd\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552403 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t2d8\" (UniqueName: \"kubernetes.io/projected/696ed9e7-6ac5-4389-806f-5bc63b0a7412-kube-api-access-7t2d8\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552458 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-config-data\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552486 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-run-httpd\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.552530 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.555780 4694 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655053 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t2d8\" (UniqueName: \"kubernetes.io/projected/696ed9e7-6ac5-4389-806f-5bc63b0a7412-kube-api-access-7t2d8\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655333 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-config-data\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655375 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-run-httpd\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655431 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655491 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655549 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-scripts\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.655571 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-log-httpd\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.656115 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-log-httpd\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.656123 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-run-httpd\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.659232 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.660639 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-config-data\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.661014 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.662158 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-scripts\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.677387 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t2d8\" (UniqueName: \"kubernetes.io/projected/696ed9e7-6ac5-4389-806f-5bc63b0a7412-kube-api-access-7t2d8\") pod \"ceilometer-0\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.812760 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.919702 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74173fce-ed52-4c41-9285-c47dfcf29abb" path="/var/lib/kubelet/pods/74173fce-ed52-4c41-9285-c47dfcf29abb/volumes" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.920284 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d525bca6-b587-4bac-a01b-d9b410ad69f6" path="/var/lib/kubelet/pods/d525bca6-b587-4bac-a01b-d9b410ad69f6/volumes" Feb 17 17:05:08 crc kubenswrapper[4694]: I0217 17:05:08.992438 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:09 crc kubenswrapper[4694]: I0217 17:05:09.076878 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:09 crc kubenswrapper[4694]: W0217 17:05:09.095178 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696ed9e7_6ac5_4389_806f_5bc63b0a7412.slice/crio-bb8cce7411c2c5d97d4686fd617902943993a5119fc5157c30d7288f8f97be69 WatchSource:0}: Error finding container bb8cce7411c2c5d97d4686fd617902943993a5119fc5157c30d7288f8f97be69: Status 404 returned error can't find the container with id bb8cce7411c2c5d97d4686fd617902943993a5119fc5157c30d7288f8f97be69 Feb 17 17:05:09 crc kubenswrapper[4694]: I0217 17:05:09.135543 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerStarted","Data":"bb8cce7411c2c5d97d4686fd617902943993a5119fc5157c30d7288f8f97be69"} Feb 17 17:05:09 crc kubenswrapper[4694]: I0217 17:05:09.138895 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"140f99c0-41c1-40ca-a2e7-2a23e7485d71","Type":"ContainerStarted","Data":"5a2becca8ccce2cbfbf1bcf09fd24df5c3f803f782c55a13c03fe1b6e49ca14a"} Feb 
17 17:05:10 crc kubenswrapper[4694]: I0217 17:05:10.150962 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerStarted","Data":"25041799ed886710f6e37c9d2c47409c93f5bd09aee90c18644a31d2bbaf5627"} Feb 17 17:05:10 crc kubenswrapper[4694]: I0217 17:05:10.153199 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"140f99c0-41c1-40ca-a2e7-2a23e7485d71","Type":"ContainerStarted","Data":"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718"} Feb 17 17:05:10 crc kubenswrapper[4694]: I0217 17:05:10.153377 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"140f99c0-41c1-40ca-a2e7-2a23e7485d71","Type":"ContainerStarted","Data":"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40"} Feb 17 17:05:10 crc kubenswrapper[4694]: I0217 17:05:10.186504 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.186477537 podStartE2EDuration="2.186477537s" podCreationTimestamp="2026-02-17 17:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:10.173407855 +0000 UTC m=+1377.930483209" watchObservedRunningTime="2026-02-17 17:05:10.186477537 +0000 UTC m=+1377.943552901" Feb 17 17:05:11 crc kubenswrapper[4694]: I0217 17:05:11.169502 4694 generic.go:334] "Generic (PLEG): container finished" podID="16a2de30-5a77-4179-b166-fcc003c41c17" containerID="e426f8ac564c7fb8d937d55a954486801a1888cee574223beb1e18d07bac18d5" exitCode=0 Feb 17 17:05:11 crc kubenswrapper[4694]: I0217 17:05:11.172397 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4tzgw" 
event={"ID":"16a2de30-5a77-4179-b166-fcc003c41c17","Type":"ContainerDied","Data":"e426f8ac564c7fb8d937d55a954486801a1888cee574223beb1e18d07bac18d5"} Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.183436 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerStarted","Data":"4ba76574abc3ec29a756436fc22e2ca7ab118b5cb2a5130e262a1beb285b061e"} Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.251194 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.251250 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.293301 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.323710 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.604093 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.695163 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.741635 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-combined-ca-bundle\") pod \"16a2de30-5a77-4179-b166-fcc003c41c17\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.741726 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-config-data\") pod \"16a2de30-5a77-4179-b166-fcc003c41c17\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.741770 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-scripts\") pod \"16a2de30-5a77-4179-b166-fcc003c41c17\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.741813 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5kgn\" (UniqueName: \"kubernetes.io/projected/16a2de30-5a77-4179-b166-fcc003c41c17-kube-api-access-x5kgn\") pod \"16a2de30-5a77-4179-b166-fcc003c41c17\" (UID: \"16a2de30-5a77-4179-b166-fcc003c41c17\") " Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.768472 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-scripts" (OuterVolumeSpecName: "scripts") pod "16a2de30-5a77-4179-b166-fcc003c41c17" (UID: "16a2de30-5a77-4179-b166-fcc003c41c17"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.768976 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a2de30-5a77-4179-b166-fcc003c41c17-kube-api-access-x5kgn" (OuterVolumeSpecName: "kube-api-access-x5kgn") pod "16a2de30-5a77-4179-b166-fcc003c41c17" (UID: "16a2de30-5a77-4179-b166-fcc003c41c17"). InnerVolumeSpecName "kube-api-access-x5kgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.769228 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-dxv9h"] Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.769483 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerName="dnsmasq-dns" containerID="cri-o://6de2a7f3ec8ad114005d35a883e61ecf57e2dd74a6662179e7476023a4529232" gracePeriod=10 Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.806558 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a2de30-5a77-4179-b166-fcc003c41c17" (UID: "16a2de30-5a77-4179-b166-fcc003c41c17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.819686 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-config-data" (OuterVolumeSpecName: "config-data") pod "16a2de30-5a77-4179-b166-fcc003c41c17" (UID: "16a2de30-5a77-4179-b166-fcc003c41c17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.845874 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.845915 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.845925 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a2de30-5a77-4179-b166-fcc003c41c17-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:12 crc kubenswrapper[4694]: I0217 17:05:12.845934 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5kgn\" (UniqueName: \"kubernetes.io/projected/16a2de30-5a77-4179-b166-fcc003c41c17-kube-api-access-x5kgn\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.194253 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4tzgw" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.194240 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4tzgw" event={"ID":"16a2de30-5a77-4179-b166-fcc003c41c17","Type":"ContainerDied","Data":"b37fdb9985cafa3bd41de71abed44d0663bda06a1610744d81da31a2c450c495"} Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.195403 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b37fdb9985cafa3bd41de71abed44d0663bda06a1610744d81da31a2c450c495" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.202730 4694 generic.go:334] "Generic (PLEG): container finished" podID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerID="6de2a7f3ec8ad114005d35a883e61ecf57e2dd74a6662179e7476023a4529232" exitCode=0 Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.202808 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" event={"ID":"3142f4c0-5094-48ef-9151-77ee80fd3b41","Type":"ContainerDied","Data":"6de2a7f3ec8ad114005d35a883e61ecf57e2dd74a6662179e7476023a4529232"} Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.205829 4694 generic.go:334] "Generic (PLEG): container finished" podID="003c20cf-819e-4d24-ba0b-a66652b8d5a3" containerID="641e40d4546df29de4a08a0e11e9bc84004564bc7210af31fd35b53da7878083" exitCode=0 Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.205896 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" event={"ID":"003c20cf-819e-4d24-ba0b-a66652b8d5a3","Type":"ContainerDied","Data":"641e40d4546df29de4a08a0e11e9bc84004564bc7210af31fd35b53da7878083"} Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.229026 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerStarted","Data":"8012ad17611604e8ceb366a7b6a41a98b3b4d6454bc9442b4c8c2b7da3a8cb36"} Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.229496 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.252261 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-config\") pod \"3142f4c0-5094-48ef-9151-77ee80fd3b41\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.252459 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-svc\") pod \"3142f4c0-5094-48ef-9151-77ee80fd3b41\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.252517 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-sb\") pod \"3142f4c0-5094-48ef-9151-77ee80fd3b41\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.252752 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkbd4\" (UniqueName: \"kubernetes.io/projected/3142f4c0-5094-48ef-9151-77ee80fd3b41-kube-api-access-xkbd4\") pod \"3142f4c0-5094-48ef-9151-77ee80fd3b41\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.252790 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-swift-storage-0\") pod \"3142f4c0-5094-48ef-9151-77ee80fd3b41\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.252825 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-nb\") pod \"3142f4c0-5094-48ef-9151-77ee80fd3b41\" (UID: \"3142f4c0-5094-48ef-9151-77ee80fd3b41\") " Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.267903 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3142f4c0-5094-48ef-9151-77ee80fd3b41-kube-api-access-xkbd4" (OuterVolumeSpecName: "kube-api-access-xkbd4") pod "3142f4c0-5094-48ef-9151-77ee80fd3b41" (UID: "3142f4c0-5094-48ef-9151-77ee80fd3b41"). InnerVolumeSpecName "kube-api-access-xkbd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.291019 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.323137 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3142f4c0-5094-48ef-9151-77ee80fd3b41" (UID: "3142f4c0-5094-48ef-9151-77ee80fd3b41"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.333961 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.334268 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.361622 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkbd4\" (UniqueName: \"kubernetes.io/projected/3142f4c0-5094-48ef-9151-77ee80fd3b41-kube-api-access-xkbd4\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.361656 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.380759 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3142f4c0-5094-48ef-9151-77ee80fd3b41" (UID: "3142f4c0-5094-48ef-9151-77ee80fd3b41"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.399502 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.399751 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-log" containerID="cri-o://5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac" gracePeriod=30 Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.400294 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-api" containerID="cri-o://ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78" gracePeriod=30 Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.405178 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-config" (OuterVolumeSpecName: "config") pod "3142f4c0-5094-48ef-9151-77ee80fd3b41" (UID: "3142f4c0-5094-48ef-9151-77ee80fd3b41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.415215 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3142f4c0-5094-48ef-9151-77ee80fd3b41" (UID: "3142f4c0-5094-48ef-9151-77ee80fd3b41"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.447677 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.447880 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-log" containerID="cri-o://2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40" gracePeriod=30 Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.448316 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-metadata" containerID="cri-o://a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718" gracePeriod=30 Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.463677 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.463711 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.463724 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.468075 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"3142f4c0-5094-48ef-9151-77ee80fd3b41" (UID: "3142f4c0-5094-48ef-9151-77ee80fd3b41"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.555870 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.555927 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.565769 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3142f4c0-5094-48ef-9151-77ee80fd3b41-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:13 crc kubenswrapper[4694]: I0217 17:05:13.817567 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:13 crc kubenswrapper[4694]: E0217 17:05:13.832844 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod140f99c0_41c1_40ca_a2e7_2a23e7485d71.slice/crio-conmon-a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.058328 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.178285 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-nova-metadata-tls-certs\") pod \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.178406 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140f99c0-41c1-40ca-a2e7-2a23e7485d71-logs\") pod \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.178664 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140f99c0-41c1-40ca-a2e7-2a23e7485d71-logs" (OuterVolumeSpecName: "logs") pod "140f99c0-41c1-40ca-a2e7-2a23e7485d71" (UID: "140f99c0-41c1-40ca-a2e7-2a23e7485d71"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.178810 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkpz5\" (UniqueName: \"kubernetes.io/projected/140f99c0-41c1-40ca-a2e7-2a23e7485d71-kube-api-access-hkpz5\") pod \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.178845 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-combined-ca-bundle\") pod \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.179129 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-config-data\") pod \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\" (UID: \"140f99c0-41c1-40ca-a2e7-2a23e7485d71\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.179578 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/140f99c0-41c1-40ca-a2e7-2a23e7485d71-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.184307 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140f99c0-41c1-40ca-a2e7-2a23e7485d71-kube-api-access-hkpz5" (OuterVolumeSpecName: "kube-api-access-hkpz5") pod "140f99c0-41c1-40ca-a2e7-2a23e7485d71" (UID: "140f99c0-41c1-40ca-a2e7-2a23e7485d71"). InnerVolumeSpecName "kube-api-access-hkpz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.208455 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "140f99c0-41c1-40ca-a2e7-2a23e7485d71" (UID: "140f99c0-41c1-40ca-a2e7-2a23e7485d71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.224560 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-config-data" (OuterVolumeSpecName: "config-data") pod "140f99c0-41c1-40ca-a2e7-2a23e7485d71" (UID: "140f99c0-41c1-40ca-a2e7-2a23e7485d71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.254446 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerStarted","Data":"de5af6f938daf489c589a6bedfcd75049b6d34017f9745a64cc5776f76a36f51"} Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.256332 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.256487 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "140f99c0-41c1-40ca-a2e7-2a23e7485d71" (UID: "140f99c0-41c1-40ca-a2e7-2a23e7485d71"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262114 4694 generic.go:334] "Generic (PLEG): container finished" podID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerID="a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718" exitCode=0 Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262144 4694 generic.go:334] "Generic (PLEG): container finished" podID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerID="2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40" exitCode=143 Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262182 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"140f99c0-41c1-40ca-a2e7-2a23e7485d71","Type":"ContainerDied","Data":"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718"} Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262207 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"140f99c0-41c1-40ca-a2e7-2a23e7485d71","Type":"ContainerDied","Data":"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40"} Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262216 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"140f99c0-41c1-40ca-a2e7-2a23e7485d71","Type":"ContainerDied","Data":"5a2becca8ccce2cbfbf1bcf09fd24df5c3f803f782c55a13c03fe1b6e49ca14a"} Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262230 4694 scope.go:117] "RemoveContainer" containerID="a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.262331 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.269701 4694 generic.go:334] "Generic (PLEG): container finished" podID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerID="5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac" exitCode=143 Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.269773 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"717ffc09-ec9c-4445-accd-340dd9f758d9","Type":"ContainerDied","Data":"5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac"} Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.272795 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" event={"ID":"3142f4c0-5094-48ef-9151-77ee80fd3b41","Type":"ContainerDied","Data":"136848e0df83da5dcfe890ac32dd1e84f19c680f6f77f8749cda37d035b3f055"} Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.272945 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-dxv9h" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.282332 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.822161342 podStartE2EDuration="6.282317193s" podCreationTimestamp="2026-02-17 17:05:08 +0000 UTC" firstStartedPulling="2026-02-17 17:05:09.099010946 +0000 UTC m=+1376.856086280" lastFinishedPulling="2026-02-17 17:05:13.559166807 +0000 UTC m=+1381.316242131" observedRunningTime="2026-02-17 17:05:14.276454769 +0000 UTC m=+1382.033530093" watchObservedRunningTime="2026-02-17 17:05:14.282317193 +0000 UTC m=+1382.039392517" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.284506 4694 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.284524 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkpz5\" (UniqueName: \"kubernetes.io/projected/140f99c0-41c1-40ca-a2e7-2a23e7485d71-kube-api-access-hkpz5\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.284534 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.284543 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/140f99c0-41c1-40ca-a2e7-2a23e7485d71-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.329835 4694 scope.go:117] "RemoveContainer" containerID="2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40" Feb 17 17:05:14 crc kubenswrapper[4694]: 
I0217 17:05:14.338841 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.355512 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.374780 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-dxv9h"] Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.375656 4694 scope.go:117] "RemoveContainer" containerID="a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718" Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.384177 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718\": container with ID starting with a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718 not found: ID does not exist" containerID="a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.384228 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718"} err="failed to get container status \"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718\": rpc error: code = NotFound desc = could not find container \"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718\": container with ID starting with a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718 not found: ID does not exist" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.384260 4694 scope.go:117] "RemoveContainer" containerID="2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40" Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.384882 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40\": container with ID starting with 2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40 not found: ID does not exist" containerID="2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.384915 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40"} err="failed to get container status \"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40\": rpc error: code = NotFound desc = could not find container \"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40\": container with ID starting with 2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40 not found: ID does not exist" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.384940 4694 scope.go:117] "RemoveContainer" containerID="a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.385194 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718"} err="failed to get container status \"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718\": rpc error: code = NotFound desc = could not find container \"a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718\": container with ID starting with a1518ef92a1e52add2343773115e09b153d404891e9d065d2a359b3aeba48718 not found: ID does not exist" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.385212 4694 scope.go:117] "RemoveContainer" containerID="2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.385369 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40"} err="failed to get container status \"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40\": rpc error: code = NotFound desc = could not find container \"2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40\": container with ID starting with 2c021c51f4cadb101a5fa8e66624b7588984ba599acfafc6f0bf2de08288af40 not found: ID does not exist" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.385384 4694 scope.go:117] "RemoveContainer" containerID="6de2a7f3ec8ad114005d35a883e61ecf57e2dd74a6662179e7476023a4529232" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.389391 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-dxv9h"] Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.400673 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.401075 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerName="dnsmasq-dns" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401092 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerName="dnsmasq-dns" Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.401103 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerName="init" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401109 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerName="init" Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.401120 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a2de30-5a77-4179-b166-fcc003c41c17" containerName="nova-manage" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401127 4694 
state_mem.go:107] "Deleted CPUSet assignment" podUID="16a2de30-5a77-4179-b166-fcc003c41c17" containerName="nova-manage" Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.401158 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-metadata" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401166 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-metadata" Feb 17 17:05:14 crc kubenswrapper[4694]: E0217 17:05:14.401191 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-log" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401199 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-log" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401390 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-metadata" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401416 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a2de30-5a77-4179-b166-fcc003c41c17" containerName="nova-manage" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401429 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" containerName="nova-metadata-log" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.401444 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" containerName="dnsmasq-dns" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.402522 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.407341 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.407356 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.418915 4694 scope.go:117] "RemoveContainer" containerID="527d7558fd44b5892dc84e505a5708954dcb13334b28b652122a63ed1d382550" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.427902 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.588661 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.588718 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.588793 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/143d7405-b363-476c-948d-a8fdb7cbbe5d-logs\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.588849 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-config-data\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.588893 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lspwq\" (UniqueName: \"kubernetes.io/projected/143d7405-b363-476c-948d-a8fdb7cbbe5d-kube-api-access-lspwq\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.622052 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.622096 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.622138 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.622844 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d6c34eac314e32bb2a700fd2365f6cc5994e5e6d675cca523ef76c638d044d6"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.622889 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://4d6c34eac314e32bb2a700fd2365f6cc5994e5e6d675cca523ef76c638d044d6" gracePeriod=600 Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.690510 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lspwq\" (UniqueName: \"kubernetes.io/projected/143d7405-b363-476c-948d-a8fdb7cbbe5d-kube-api-access-lspwq\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.690621 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.690664 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.690753 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/143d7405-b363-476c-948d-a8fdb7cbbe5d-logs\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.690823 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-config-data\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.691153 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/143d7405-b363-476c-948d-a8fdb7cbbe5d-logs\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.697525 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-config-data\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.699130 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.701138 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.711972 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lspwq\" (UniqueName: \"kubernetes.io/projected/143d7405-b363-476c-948d-a8fdb7cbbe5d-kube-api-access-lspwq\") pod \"nova-metadata-0\" (UID: 
\"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.728322 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.855878 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.906481 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140f99c0-41c1-40ca-a2e7-2a23e7485d71" path="/var/lib/kubelet/pods/140f99c0-41c1-40ca-a2e7-2a23e7485d71/volumes" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.907452 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3142f4c0-5094-48ef-9151-77ee80fd3b41" path="/var/lib/kubelet/pods/3142f4c0-5094-48ef-9151-77ee80fd3b41/volumes" Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.997984 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd9zk\" (UniqueName: \"kubernetes.io/projected/003c20cf-819e-4d24-ba0b-a66652b8d5a3-kube-api-access-vd9zk\") pod \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.998077 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-config-data\") pod \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.998108 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-combined-ca-bundle\") pod \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\" (UID: 
\"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " Feb 17 17:05:14 crc kubenswrapper[4694]: I0217 17:05:14.998153 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-scripts\") pod \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\" (UID: \"003c20cf-819e-4d24-ba0b-a66652b8d5a3\") " Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.005168 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003c20cf-819e-4d24-ba0b-a66652b8d5a3-kube-api-access-vd9zk" (OuterVolumeSpecName: "kube-api-access-vd9zk") pod "003c20cf-819e-4d24-ba0b-a66652b8d5a3" (UID: "003c20cf-819e-4d24-ba0b-a66652b8d5a3"). InnerVolumeSpecName "kube-api-access-vd9zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.007531 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-scripts" (OuterVolumeSpecName: "scripts") pod "003c20cf-819e-4d24-ba0b-a66652b8d5a3" (UID: "003c20cf-819e-4d24-ba0b-a66652b8d5a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.040490 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-config-data" (OuterVolumeSpecName: "config-data") pod "003c20cf-819e-4d24-ba0b-a66652b8d5a3" (UID: "003c20cf-819e-4d24-ba0b-a66652b8d5a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.056305 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "003c20cf-819e-4d24-ba0b-a66652b8d5a3" (UID: "003c20cf-819e-4d24-ba0b-a66652b8d5a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.101167 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd9zk\" (UniqueName: \"kubernetes.io/projected/003c20cf-819e-4d24-ba0b-a66652b8d5a3-kube-api-access-vd9zk\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.101209 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.101219 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.101227 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c20cf-819e-4d24-ba0b-a66652b8d5a3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.230926 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.315812 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"143d7405-b363-476c-948d-a8fdb7cbbe5d","Type":"ContainerStarted","Data":"f25e25af98c5bf0ed5b64f7c689c4db5cb643b4ac04211d49ed6353d96dd7b6f"} Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.318397 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 17:05:15 crc kubenswrapper[4694]: E0217 17:05:15.318779 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003c20cf-819e-4d24-ba0b-a66652b8d5a3" containerName="nova-cell1-conductor-db-sync" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.318792 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="003c20cf-819e-4d24-ba0b-a66652b8d5a3" containerName="nova-cell1-conductor-db-sync" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.318966 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="003c20cf-819e-4d24-ba0b-a66652b8d5a3" containerName="nova-cell1-conductor-db-sync" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.319519 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.325294 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="4d6c34eac314e32bb2a700fd2365f6cc5994e5e6d675cca523ef76c638d044d6" exitCode=0 Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.325355 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"4d6c34eac314e32bb2a700fd2365f6cc5994e5e6d675cca523ef76c638d044d6"} Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.325391 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f"} Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.325412 4694 scope.go:117] "RemoveContainer" containerID="5aa651e570a8961f4584e9fe11d3f397047e9a6daf1e15f72d714be968799658" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.333887 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.342877 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.343673 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kpsxt" event={"ID":"003c20cf-819e-4d24-ba0b-a66652b8d5a3","Type":"ContainerDied","Data":"cf562f9008053c879d98c22728873b6400c22c7d0e8c3b2050a277b38274ce77"} Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.343728 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf562f9008053c879d98c22728873b6400c22c7d0e8c3b2050a277b38274ce77" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.343961 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2fba692a-e3f2-413e-9493-c401862d4626" containerName="nova-scheduler-scheduler" containerID="cri-o://0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61" gracePeriod=30 Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.509621 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91495551-d244-497c-b8f1-376b3206a3aa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.509695 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91495551-d244-497c-b8f1-376b3206a3aa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.509746 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzpcz\" (UniqueName: 
\"kubernetes.io/projected/91495551-d244-497c-b8f1-376b3206a3aa-kube-api-access-hzpcz\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.611279 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91495551-d244-497c-b8f1-376b3206a3aa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.611339 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91495551-d244-497c-b8f1-376b3206a3aa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.611387 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzpcz\" (UniqueName: \"kubernetes.io/projected/91495551-d244-497c-b8f1-376b3206a3aa-kube-api-access-hzpcz\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.617588 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91495551-d244-497c-b8f1-376b3206a3aa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.624520 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91495551-d244-497c-b8f1-376b3206a3aa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.636490 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzpcz\" (UniqueName: \"kubernetes.io/projected/91495551-d244-497c-b8f1-376b3206a3aa-kube-api-access-hzpcz\") pod \"nova-cell1-conductor-0\" (UID: \"91495551-d244-497c-b8f1-376b3206a3aa\") " pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:15 crc kubenswrapper[4694]: I0217 17:05:15.643085 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:16 crc kubenswrapper[4694]: I0217 17:05:16.078064 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 17:05:16 crc kubenswrapper[4694]: I0217 17:05:16.356139 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"143d7405-b363-476c-948d-a8fdb7cbbe5d","Type":"ContainerStarted","Data":"854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d"} Feb 17 17:05:16 crc kubenswrapper[4694]: I0217 17:05:16.358131 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"143d7405-b363-476c-948d-a8fdb7cbbe5d","Type":"ContainerStarted","Data":"b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b"} Feb 17 17:05:16 crc kubenswrapper[4694]: I0217 17:05:16.363542 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"91495551-d244-497c-b8f1-376b3206a3aa","Type":"ContainerStarted","Data":"03e2a63b00a05d55e17fef90bedd6e49a64d3feb92d3f01047da9903ffb638ee"} Feb 17 17:05:16 crc kubenswrapper[4694]: I0217 17:05:16.363571 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"91495551-d244-497c-b8f1-376b3206a3aa","Type":"ContainerStarted","Data":"254098652b0ce6c60ae71ebd952d88192f00084e66acd61492b416b9deb70e79"} Feb 17 17:05:16 crc kubenswrapper[4694]: I0217 17:05:16.385134 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.385112638 podStartE2EDuration="2.385112638s" podCreationTimestamp="2026-02-17 17:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:16.376774332 +0000 UTC m=+1384.133849666" watchObservedRunningTime="2026-02-17 17:05:16.385112638 +0000 UTC m=+1384.142187972" Feb 17 17:05:17 crc kubenswrapper[4694]: E0217 17:05:17.295264 4694 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 17:05:17 crc kubenswrapper[4694]: E0217 17:05:17.297118 4694 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 17:05:17 crc kubenswrapper[4694]: E0217 17:05:17.298493 4694 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 17:05:17 crc kubenswrapper[4694]: E0217 17:05:17.298571 4694 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2fba692a-e3f2-413e-9493-c401862d4626" containerName="nova-scheduler-scheduler" Feb 17 17:05:17 crc kubenswrapper[4694]: I0217 17:05:17.375072 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:17 crc kubenswrapper[4694]: I0217 17:05:17.405421 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.405390604 podStartE2EDuration="2.405390604s" podCreationTimestamp="2026-02-17 17:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:17.389678747 +0000 UTC m=+1385.146754121" watchObservedRunningTime="2026-02-17 17:05:17.405390604 +0000 UTC m=+1385.162465958" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.383759 4694 generic.go:334] "Generic (PLEG): container finished" podID="2fba692a-e3f2-413e-9493-c401862d4626" containerID="0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61" exitCode=0 Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.383855 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2fba692a-e3f2-413e-9493-c401862d4626","Type":"ContainerDied","Data":"0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61"} Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.384055 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2fba692a-e3f2-413e-9493-c401862d4626","Type":"ContainerDied","Data":"6975d049100c1567a326a3c3c33c510cd07053c4544ae9eaca801bbd11a2a303"} Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.384069 4694 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6975d049100c1567a326a3c3c33c510cd07053c4544ae9eaca801bbd11a2a303" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.424933 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.580204 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-config-data\") pod \"2fba692a-e3f2-413e-9493-c401862d4626\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.580295 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqlm2\" (UniqueName: \"kubernetes.io/projected/2fba692a-e3f2-413e-9493-c401862d4626-kube-api-access-jqlm2\") pod \"2fba692a-e3f2-413e-9493-c401862d4626\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.580554 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-combined-ca-bundle\") pod \"2fba692a-e3f2-413e-9493-c401862d4626\" (UID: \"2fba692a-e3f2-413e-9493-c401862d4626\") " Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.586769 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fba692a-e3f2-413e-9493-c401862d4626-kube-api-access-jqlm2" (OuterVolumeSpecName: "kube-api-access-jqlm2") pod "2fba692a-e3f2-413e-9493-c401862d4626" (UID: "2fba692a-e3f2-413e-9493-c401862d4626"). InnerVolumeSpecName "kube-api-access-jqlm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.614790 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fba692a-e3f2-413e-9493-c401862d4626" (UID: "2fba692a-e3f2-413e-9493-c401862d4626"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.620827 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-config-data" (OuterVolumeSpecName: "config-data") pod "2fba692a-e3f2-413e-9493-c401862d4626" (UID: "2fba692a-e3f2-413e-9493-c401862d4626"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.682557 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.682713 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqlm2\" (UniqueName: \"kubernetes.io/projected/2fba692a-e3f2-413e-9493-c401862d4626-kube-api-access-jqlm2\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:18 crc kubenswrapper[4694]: I0217 17:05:18.682788 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fba692a-e3f2-413e-9493-c401862d4626-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.342059 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.418428 4694 generic.go:334] "Generic (PLEG): container finished" podID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerID="ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78" exitCode=0 Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.418539 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.418873 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"717ffc09-ec9c-4445-accd-340dd9f758d9","Type":"ContainerDied","Data":"ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78"} Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.418909 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.418928 4694 scope.go:117] "RemoveContainer" containerID="ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.418918 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"717ffc09-ec9c-4445-accd-340dd9f758d9","Type":"ContainerDied","Data":"a6f89b130ccb8cf47291a8258bea570d68c60021c9eeb899249da669df29f8a1"} Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.445415 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.453904 4694 scope.go:117] "RemoveContainer" containerID="5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.467652 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.478524 4694 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: E0217 17:05:19.479243 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-api" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.479264 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-api" Feb 17 17:05:19 crc kubenswrapper[4694]: E0217 17:05:19.479289 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-log" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.479296 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-log" Feb 17 17:05:19 crc kubenswrapper[4694]: E0217 17:05:19.479314 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fba692a-e3f2-413e-9493-c401862d4626" containerName="nova-scheduler-scheduler" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.479355 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fba692a-e3f2-413e-9493-c401862d4626" containerName="nova-scheduler-scheduler" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.479581 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fba692a-e3f2-413e-9493-c401862d4626" containerName="nova-scheduler-scheduler" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.479636 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-log" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.479653 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" containerName="nova-api-api" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.480761 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.483542 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.485558 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.492560 4694 scope.go:117] "RemoveContainer" containerID="ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78" Feb 17 17:05:19 crc kubenswrapper[4694]: E0217 17:05:19.493084 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78\": container with ID starting with ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78 not found: ID does not exist" containerID="ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.493123 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78"} err="failed to get container status \"ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78\": rpc error: code = NotFound desc = could not find container \"ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78\": container with ID starting with ea4e40ea8f14989e1863eef47cd3e00525ee0c24e1ef5cf1b22a272844b2da78 not found: ID does not exist" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.493147 4694 scope.go:117] "RemoveContainer" containerID="5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac" Feb 17 17:05:19 crc kubenswrapper[4694]: E0217 17:05:19.493645 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac\": container with ID starting with 5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac not found: ID does not exist" containerID="5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.493749 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac"} err="failed to get container status \"5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac\": rpc error: code = NotFound desc = could not find container \"5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac\": container with ID starting with 5c2205838a950208dc5dd32d5ff30e0b778d46b32e34617b782628a7a5e486ac not found: ID does not exist" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.500548 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717ffc09-ec9c-4445-accd-340dd9f758d9-logs\") pod \"717ffc09-ec9c-4445-accd-340dd9f758d9\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.500714 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-combined-ca-bundle\") pod \"717ffc09-ec9c-4445-accd-340dd9f758d9\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.500767 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9g8r\" (UniqueName: \"kubernetes.io/projected/717ffc09-ec9c-4445-accd-340dd9f758d9-kube-api-access-c9g8r\") pod \"717ffc09-ec9c-4445-accd-340dd9f758d9\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.500825 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-config-data\") pod \"717ffc09-ec9c-4445-accd-340dd9f758d9\" (UID: \"717ffc09-ec9c-4445-accd-340dd9f758d9\") " Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.505238 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717ffc09-ec9c-4445-accd-340dd9f758d9-logs" (OuterVolumeSpecName: "logs") pod "717ffc09-ec9c-4445-accd-340dd9f758d9" (UID: "717ffc09-ec9c-4445-accd-340dd9f758d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.522462 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717ffc09-ec9c-4445-accd-340dd9f758d9-kube-api-access-c9g8r" (OuterVolumeSpecName: "kube-api-access-c9g8r") pod "717ffc09-ec9c-4445-accd-340dd9f758d9" (UID: "717ffc09-ec9c-4445-accd-340dd9f758d9"). InnerVolumeSpecName "kube-api-access-c9g8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.533436 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "717ffc09-ec9c-4445-accd-340dd9f758d9" (UID: "717ffc09-ec9c-4445-accd-340dd9f758d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.537133 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-config-data" (OuterVolumeSpecName: "config-data") pod "717ffc09-ec9c-4445-accd-340dd9f758d9" (UID: "717ffc09-ec9c-4445-accd-340dd9f758d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603241 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-config-data\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603345 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603535 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45zf\" (UniqueName: \"kubernetes.io/projected/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-kube-api-access-c45zf\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603896 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603914 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9g8r\" (UniqueName: \"kubernetes.io/projected/717ffc09-ec9c-4445-accd-340dd9f758d9-kube-api-access-c9g8r\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603925 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717ffc09-ec9c-4445-accd-340dd9f758d9-config-data\") on 
node \"crc\" DevicePath \"\"" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.603935 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717ffc09-ec9c-4445-accd-340dd9f758d9-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.705233 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-config-data\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.705793 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.705864 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45zf\" (UniqueName: \"kubernetes.io/projected/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-kube-api-access-c45zf\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.709678 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-config-data\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.709947 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.724155 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45zf\" (UniqueName: \"kubernetes.io/projected/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-kube-api-access-c45zf\") pod \"nova-scheduler-0\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.729392 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.729442 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.761040 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.772387 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.782329 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.784039 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.787828 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.801173 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.897784 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.909746 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jrgb\" (UniqueName: \"kubernetes.io/projected/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-kube-api-access-9jrgb\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.909915 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.910164 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-logs\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:19 crc kubenswrapper[4694]: I0217 17:05:19.910200 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-config-data\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.011964 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jrgb\" (UniqueName: \"kubernetes.io/projected/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-kube-api-access-9jrgb\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.012025 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.012108 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-logs\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.012135 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-config-data\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.013855 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-logs\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.017998 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.023481 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-config-data\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 
17:05:20.030390 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jrgb\" (UniqueName: \"kubernetes.io/projected/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-kube-api-access-9jrgb\") pod \"nova-api-0\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.101471 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.319837 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.450804 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8475e1ad-422c-4ae0-9230-052dc4e0e8fc","Type":"ContainerStarted","Data":"a49985fb6964b7688d0cf3cbfbe21f3aa3f45e91544533c2ab03e45a518c776a"} Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.568647 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:20 crc kubenswrapper[4694]: W0217 17:05:20.573996 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7100d71c_f2cd_4ebd_a26d_825cac8bd1e4.slice/crio-67b7dc1b4a11aed2e7106838f3f2b113c7a833a1a818d631b4ca33365292a90e WatchSource:0}: Error finding container 67b7dc1b4a11aed2e7106838f3f2b113c7a833a1a818d631b4ca33365292a90e: Status 404 returned error can't find the container with id 67b7dc1b4a11aed2e7106838f3f2b113c7a833a1a818d631b4ca33365292a90e Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.908145 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fba692a-e3f2-413e-9493-c401862d4626" path="/var/lib/kubelet/pods/2fba692a-e3f2-413e-9493-c401862d4626/volumes" Feb 17 17:05:20 crc kubenswrapper[4694]: I0217 17:05:20.909261 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="717ffc09-ec9c-4445-accd-340dd9f758d9" path="/var/lib/kubelet/pods/717ffc09-ec9c-4445-accd-340dd9f758d9/volumes" Feb 17 17:05:21 crc kubenswrapper[4694]: I0217 17:05:21.463010 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4","Type":"ContainerStarted","Data":"4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da"} Feb 17 17:05:21 crc kubenswrapper[4694]: I0217 17:05:21.463377 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4","Type":"ContainerStarted","Data":"f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf"} Feb 17 17:05:21 crc kubenswrapper[4694]: I0217 17:05:21.463392 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4","Type":"ContainerStarted","Data":"67b7dc1b4a11aed2e7106838f3f2b113c7a833a1a818d631b4ca33365292a90e"} Feb 17 17:05:21 crc kubenswrapper[4694]: I0217 17:05:21.464872 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8475e1ad-422c-4ae0-9230-052dc4e0e8fc","Type":"ContainerStarted","Data":"91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b"} Feb 17 17:05:21 crc kubenswrapper[4694]: I0217 17:05:21.518301 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.51826959 podStartE2EDuration="2.51826959s" podCreationTimestamp="2026-02-17 17:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:21.480821308 +0000 UTC m=+1389.237896672" watchObservedRunningTime="2026-02-17 17:05:21.51826959 +0000 UTC m=+1389.275344954" Feb 17 17:05:21 crc kubenswrapper[4694]: I0217 17:05:21.530355 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.5303292170000002 podStartE2EDuration="2.530329217s" podCreationTimestamp="2026-02-17 17:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:21.502558373 +0000 UTC m=+1389.259633727" watchObservedRunningTime="2026-02-17 17:05:21.530329217 +0000 UTC m=+1389.287404571" Feb 17 17:05:24 crc kubenswrapper[4694]: I0217 17:05:24.728731 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 17:05:24 crc kubenswrapper[4694]: I0217 17:05:24.729163 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 17:05:24 crc kubenswrapper[4694]: I0217 17:05:24.907113 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 17:05:25 crc kubenswrapper[4694]: I0217 17:05:25.673390 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 17:05:25 crc kubenswrapper[4694]: I0217 17:05:25.743297 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:25 crc kubenswrapper[4694]: I0217 17:05:25.743344 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:29 crc kubenswrapper[4694]: I0217 17:05:29.898343 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 17:05:29 crc kubenswrapper[4694]: I0217 17:05:29.929125 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 17:05:30 crc kubenswrapper[4694]: I0217 17:05:30.102403 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:05:30 crc kubenswrapper[4694]: I0217 17:05:30.102484 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:05:30 crc kubenswrapper[4694]: I0217 17:05:30.607657 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 17:05:31 crc kubenswrapper[4694]: I0217 17:05:31.184820 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:31 crc kubenswrapper[4694]: I0217 17:05:31.185159 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:34 crc kubenswrapper[4694]: I0217 17:05:34.733329 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 17:05:34 crc kubenswrapper[4694]: I0217 17:05:34.736736 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 17:05:34 crc kubenswrapper[4694]: I0217 17:05:34.743423 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 17:05:35 crc kubenswrapper[4694]: I0217 
17:05:35.620279 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.531564 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.635703 4694 generic.go:334] "Generic (PLEG): container finished" podID="e81ac821-dfe2-4062-ac2c-82e9c82fac91" containerID="ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0" exitCode=137 Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.636453 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.636929 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e81ac821-dfe2-4062-ac2c-82e9c82fac91","Type":"ContainerDied","Data":"ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0"} Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.636958 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e81ac821-dfe2-4062-ac2c-82e9c82fac91","Type":"ContainerDied","Data":"ddba52eb565c736f920920833c02beb752ed8939869f1920994fbb060114a456"} Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.636975 4694 scope.go:117] "RemoveContainer" containerID="ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.641741 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-config-data\") pod \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.641860 4694 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-combined-ca-bundle\") pod \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.641985 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpcth\" (UniqueName: \"kubernetes.io/projected/e81ac821-dfe2-4062-ac2c-82e9c82fac91-kube-api-access-cpcth\") pod \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\" (UID: \"e81ac821-dfe2-4062-ac2c-82e9c82fac91\") " Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.653861 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81ac821-dfe2-4062-ac2c-82e9c82fac91-kube-api-access-cpcth" (OuterVolumeSpecName: "kube-api-access-cpcth") pod "e81ac821-dfe2-4062-ac2c-82e9c82fac91" (UID: "e81ac821-dfe2-4062-ac2c-82e9c82fac91"). InnerVolumeSpecName "kube-api-access-cpcth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.667189 4694 scope.go:117] "RemoveContainer" containerID="ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0" Feb 17 17:05:37 crc kubenswrapper[4694]: E0217 17:05:37.669424 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0\": container with ID starting with ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0 not found: ID does not exist" containerID="ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.669471 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0"} err="failed to get container status \"ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0\": rpc error: code = NotFound desc = could not find container \"ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0\": container with ID starting with ebfd69c421bb6c694d52c6b2ae6509460f27f11b083b4900fd89cb63e8cebdb0 not found: ID does not exist" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.673047 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-config-data" (OuterVolumeSpecName: "config-data") pod "e81ac821-dfe2-4062-ac2c-82e9c82fac91" (UID: "e81ac821-dfe2-4062-ac2c-82e9c82fac91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.717845 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e81ac821-dfe2-4062-ac2c-82e9c82fac91" (UID: "e81ac821-dfe2-4062-ac2c-82e9c82fac91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.746979 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpcth\" (UniqueName: \"kubernetes.io/projected/e81ac821-dfe2-4062-ac2c-82e9c82fac91-kube-api-access-cpcth\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.747222 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.747298 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81ac821-dfe2-4062-ac2c-82e9c82fac91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:37 crc kubenswrapper[4694]: I0217 17:05:37.991119 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.008744 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.038068 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:38 crc kubenswrapper[4694]: E0217 17:05:38.038750 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81ac821-dfe2-4062-ac2c-82e9c82fac91" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.038770 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81ac821-dfe2-4062-ac2c-82e9c82fac91" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.039199 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81ac821-dfe2-4062-ac2c-82e9c82fac91" containerName="nova-cell1-novncproxy-novncproxy" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.040178 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.049484 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.055542 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.055936 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.056159 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.155571 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.155679 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.155840 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.155919 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.155992 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfn75\" (UniqueName: \"kubernetes.io/projected/9894d581-eaea-45f8-a4ca-1a73c9fc778b-kube-api-access-gfn75\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.257762 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.258162 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.258873 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.258953 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.258988 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfn75\" (UniqueName: \"kubernetes.io/projected/9894d581-eaea-45f8-a4ca-1a73c9fc778b-kube-api-access-gfn75\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.263734 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.263777 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.263744 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.270025 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894d581-eaea-45f8-a4ca-1a73c9fc778b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.278779 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfn75\" (UniqueName: \"kubernetes.io/projected/9894d581-eaea-45f8-a4ca-1a73c9fc778b-kube-api-access-gfn75\") pod \"nova-cell1-novncproxy-0\" (UID: \"9894d581-eaea-45f8-a4ca-1a73c9fc778b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.358549 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.827706 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.828430 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 17:05:38 crc kubenswrapper[4694]: I0217 17:05:38.905392 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81ac821-dfe2-4062-ac2c-82e9c82fac91" path="/var/lib/kubelet/pods/e81ac821-dfe2-4062-ac2c-82e9c82fac91/volumes" Feb 17 17:05:39 crc kubenswrapper[4694]: I0217 17:05:39.656039 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9894d581-eaea-45f8-a4ca-1a73c9fc778b","Type":"ContainerStarted","Data":"2b5d1abeb9aa06c62aee2bfd8ecf5cbd4fb245cfb03dec0daafe22103613ba3f"} Feb 17 17:05:39 crc kubenswrapper[4694]: I0217 17:05:39.656646 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9894d581-eaea-45f8-a4ca-1a73c9fc778b","Type":"ContainerStarted","Data":"0c9fca4500b822e51633987f17a407e5468aac453380ff1f49d27d5b5083279f"} Feb 17 17:05:39 crc kubenswrapper[4694]: I0217 17:05:39.691226 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.691207223 podStartE2EDuration="2.691207223s" podCreationTimestamp="2026-02-17 17:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:39.674149913 +0000 UTC m=+1407.431225247" watchObservedRunningTime="2026-02-17 17:05:39.691207223 +0000 UTC m=+1407.448282557" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.113334 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 
17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.113911 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.114279 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.114333 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.118189 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.118477 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.337230 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8qv52"] Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.340425 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.361973 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8qv52"] Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.502940 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.503007 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-config\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.503082 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrh6n\" (UniqueName: \"kubernetes.io/projected/f079a4f8-0663-4400-a495-b684a3cf7ef9-kube-api-access-zrh6n\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.503125 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.503155 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.503200 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.606696 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.606818 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.606854 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-config\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.606913 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zrh6n\" (UniqueName: \"kubernetes.io/projected/f079a4f8-0663-4400-a495-b684a3cf7ef9-kube-api-access-zrh6n\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.606966 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.607009 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.608338 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.608480 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-config\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.608780 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.608975 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.609261 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.639927 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrh6n\" (UniqueName: \"kubernetes.io/projected/f079a4f8-0663-4400-a495-b684a3cf7ef9-kube-api-access-zrh6n\") pod \"dnsmasq-dns-5c7b6c5df9-8qv52\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:40 crc kubenswrapper[4694]: I0217 17:05:40.671596 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:41 crc kubenswrapper[4694]: I0217 17:05:41.164090 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8qv52"] Feb 17 17:05:41 crc kubenswrapper[4694]: W0217 17:05:41.166059 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf079a4f8_0663_4400_a495_b684a3cf7ef9.slice/crio-5c54de90259f959cf1bfa3dcc4926347ebdfa824a71e257645be9abba1925173 WatchSource:0}: Error finding container 5c54de90259f959cf1bfa3dcc4926347ebdfa824a71e257645be9abba1925173: Status 404 returned error can't find the container with id 5c54de90259f959cf1bfa3dcc4926347ebdfa824a71e257645be9abba1925173 Feb 17 17:05:41 crc kubenswrapper[4694]: I0217 17:05:41.680646 4694 generic.go:334] "Generic (PLEG): container finished" podID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerID="ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb" exitCode=0 Feb 17 17:05:41 crc kubenswrapper[4694]: I0217 17:05:41.680730 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" event={"ID":"f079a4f8-0663-4400-a495-b684a3cf7ef9","Type":"ContainerDied","Data":"ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb"} Feb 17 17:05:41 crc kubenswrapper[4694]: I0217 17:05:41.681038 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" event={"ID":"f079a4f8-0663-4400-a495-b684a3cf7ef9","Type":"ContainerStarted","Data":"5c54de90259f959cf1bfa3dcc4926347ebdfa824a71e257645be9abba1925173"} Feb 17 17:05:42 crc kubenswrapper[4694]: I0217 17:05:42.687226 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:42 crc kubenswrapper[4694]: I0217 17:05:42.692658 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-log" containerID="cri-o://f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf" gracePeriod=30 Feb 17 17:05:42 crc kubenswrapper[4694]: I0217 17:05:42.693463 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" event={"ID":"f079a4f8-0663-4400-a495-b684a3cf7ef9","Type":"ContainerStarted","Data":"dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d"} Feb 17 17:05:42 crc kubenswrapper[4694]: I0217 17:05:42.693495 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:42 crc kubenswrapper[4694]: I0217 17:05:42.693679 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-api" containerID="cri-o://4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da" gracePeriod=30 Feb 17 17:05:42 crc kubenswrapper[4694]: I0217 17:05:42.736137 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" podStartSLOduration=2.736119629 podStartE2EDuration="2.736119629s" podCreationTimestamp="2026-02-17 17:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:42.719282684 +0000 UTC m=+1410.476358028" watchObservedRunningTime="2026-02-17 17:05:42.736119629 +0000 UTC m=+1410.493194943" Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.228656 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.229126 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-central-agent" 
containerID="cri-o://25041799ed886710f6e37c9d2c47409c93f5bd09aee90c18644a31d2bbaf5627" gracePeriod=30 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.229140 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="proxy-httpd" containerID="cri-o://de5af6f938daf489c589a6bedfcd75049b6d34017f9745a64cc5776f76a36f51" gracePeriod=30 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.229263 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-notification-agent" containerID="cri-o://4ba76574abc3ec29a756436fc22e2ca7ab118b5cb2a5130e262a1beb285b061e" gracePeriod=30 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.229301 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="sg-core" containerID="cri-o://8012ad17611604e8ceb366a7b6a41a98b3b4d6454bc9442b4c8c2b7da3a8cb36" gracePeriod=30 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.359263 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.705306 4694 generic.go:334] "Generic (PLEG): container finished" podID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerID="de5af6f938daf489c589a6bedfcd75049b6d34017f9745a64cc5776f76a36f51" exitCode=0 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.705348 4694 generic.go:334] "Generic (PLEG): container finished" podID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerID="8012ad17611604e8ceb366a7b6a41a98b3b4d6454bc9442b4c8c2b7da3a8cb36" exitCode=2 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.705358 4694 generic.go:334] "Generic (PLEG): container finished" podID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" 
containerID="25041799ed886710f6e37c9d2c47409c93f5bd09aee90c18644a31d2bbaf5627" exitCode=0 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.705411 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerDied","Data":"de5af6f938daf489c589a6bedfcd75049b6d34017f9745a64cc5776f76a36f51"} Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.705444 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerDied","Data":"8012ad17611604e8ceb366a7b6a41a98b3b4d6454bc9442b4c8c2b7da3a8cb36"} Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.705459 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerDied","Data":"25041799ed886710f6e37c9d2c47409c93f5bd09aee90c18644a31d2bbaf5627"} Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.708348 4694 generic.go:334] "Generic (PLEG): container finished" podID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerID="f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf" exitCode=143 Feb 17 17:05:43 crc kubenswrapper[4694]: I0217 17:05:43.708638 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4","Type":"ContainerDied","Data":"f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf"} Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.310895 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.415659 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-combined-ca-bundle\") pod \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.415718 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jrgb\" (UniqueName: \"kubernetes.io/projected/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-kube-api-access-9jrgb\") pod \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.415767 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-config-data\") pod \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.415786 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-logs\") pod \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\" (UID: \"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4\") " Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.416471 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-logs" (OuterVolumeSpecName: "logs") pod "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" (UID: "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.432020 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-kube-api-access-9jrgb" (OuterVolumeSpecName: "kube-api-access-9jrgb") pod "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" (UID: "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4"). InnerVolumeSpecName "kube-api-access-9jrgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.444078 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-config-data" (OuterVolumeSpecName: "config-data") pod "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" (UID: "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.445816 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" (UID: "7100d71c-f2cd-4ebd-a26d-825cac8bd1e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.518084 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.518129 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jrgb\" (UniqueName: \"kubernetes.io/projected/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-kube-api-access-9jrgb\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.518144 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.518155 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.735826 4694 generic.go:334] "Generic (PLEG): container finished" podID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerID="4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da" exitCode=0 Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.735866 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4","Type":"ContainerDied","Data":"4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da"} Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.735905 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7100d71c-f2cd-4ebd-a26d-825cac8bd1e4","Type":"ContainerDied","Data":"67b7dc1b4a11aed2e7106838f3f2b113c7a833a1a818d631b4ca33365292a90e"} Feb 17 17:05:46 crc kubenswrapper[4694]: 
I0217 17:05:46.735871 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.735921 4694 scope.go:117] "RemoveContainer" containerID="4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.741773 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.741984 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" containerName="kube-state-metrics" containerID="cri-o://59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8" gracePeriod=30 Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.761686 4694 scope.go:117] "RemoveContainer" containerID="f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.776281 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.784106 4694 scope.go:117] "RemoveContainer" containerID="4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da" Feb 17 17:05:46 crc kubenswrapper[4694]: E0217 17:05:46.784690 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da\": container with ID starting with 4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da not found: ID does not exist" containerID="4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.784736 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da"} err="failed to get container status \"4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da\": rpc error: code = NotFound desc = could not find container \"4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da\": container with ID starting with 4cb9ad7b77b1ddeb4c13c75c64bee3801b7d42a13be507707cdebe87a49599da not found: ID does not exist" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.784763 4694 scope.go:117] "RemoveContainer" containerID="f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf" Feb 17 17:05:46 crc kubenswrapper[4694]: E0217 17:05:46.785166 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf\": container with ID starting with f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf not found: ID does not exist" containerID="f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.785194 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf"} err="failed to get container status \"f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf\": rpc error: code = NotFound desc = could not find container \"f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf\": container with ID starting with f71cc29110d604ce51ec0ae5b4be87e79dfdcde09189ac749bc870151ae8bbbf not found: ID does not exist" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.785957 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.804465 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 
17:05:46 crc kubenswrapper[4694]: E0217 17:05:46.804900 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-api" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.804923 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-api" Feb 17 17:05:46 crc kubenswrapper[4694]: E0217 17:05:46.804945 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-log" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.804954 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-log" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.805173 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-api" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.805208 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" containerName="nova-api-log" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.806136 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.813168 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.813406 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.813531 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.816486 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.909477 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7100d71c-f2cd-4ebd-a26d-825cac8bd1e4" path="/var/lib/kubelet/pods/7100d71c-f2cd-4ebd-a26d-825cac8bd1e4/volumes" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.929139 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.929216 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445z7\" (UniqueName: \"kubernetes.io/projected/1a32e407-e92e-4b92-a606-13e432972428-kube-api-access-445z7\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.929260 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.929375 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-config-data\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.929432 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a32e407-e92e-4b92-a606-13e432972428-logs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:46 crc kubenswrapper[4694]: I0217 17:05:46.929644 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.031133 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.034154 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.034262 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-445z7\" (UniqueName: \"kubernetes.io/projected/1a32e407-e92e-4b92-a606-13e432972428-kube-api-access-445z7\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.034314 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.034407 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-config-data\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.034470 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a32e407-e92e-4b92-a606-13e432972428-logs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.035109 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a32e407-e92e-4b92-a606-13e432972428-logs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.037987 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc 
kubenswrapper[4694]: I0217 17:05:47.041068 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.041441 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-config-data\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.054127 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.054362 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-445z7\" (UniqueName: \"kubernetes.io/projected/1a32e407-e92e-4b92-a606-13e432972428-kube-api-access-445z7\") pod \"nova-api-0\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.155574 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.275067 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.444343 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgbfr\" (UniqueName: \"kubernetes.io/projected/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0-kube-api-access-qgbfr\") pod \"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0\" (UID: \"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0\") " Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.449558 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0-kube-api-access-qgbfr" (OuterVolumeSpecName: "kube-api-access-qgbfr") pod "dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" (UID: "dbcfb3bf-c557-4a38-855e-4be0f77b3ab0"). InnerVolumeSpecName "kube-api-access-qgbfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.547063 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgbfr\" (UniqueName: \"kubernetes.io/projected/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0-kube-api-access-qgbfr\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.653070 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.746926 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a32e407-e92e-4b92-a606-13e432972428","Type":"ContainerStarted","Data":"7ea7126c84b25fc479078804145e8bfae18efcbf86897b09834abfa0b6c5a1c9"} Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.748673 4694 generic.go:334] "Generic (PLEG): container finished" podID="dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" containerID="59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8" exitCode=2 Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.748707 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0","Type":"ContainerDied","Data":"59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8"} Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.748728 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dbcfb3bf-c557-4a38-855e-4be0f77b3ab0","Type":"ContainerDied","Data":"dc1df6f66b8a912b9919e5143896ab457e090809278ab63c016f85a74fa75010"} Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.748745 4694 scope.go:117] "RemoveContainer" containerID="59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.748787 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.767308 4694 scope.go:117] "RemoveContainer" containerID="59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8" Feb 17 17:05:47 crc kubenswrapper[4694]: E0217 17:05:47.767770 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8\": container with ID starting with 59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8 not found: ID does not exist" containerID="59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.768492 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8"} err="failed to get container status \"59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8\": rpc error: code = NotFound desc = could not find container \"59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8\": container with ID starting with 
59c40b970a648367ffae84d34359976ca088d8a15908a54648591853fe7e28c8 not found: ID does not exist" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.786109 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.798717 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.809823 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:05:47 crc kubenswrapper[4694]: E0217 17:05:47.810275 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" containerName="kube-state-metrics" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.810287 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" containerName="kube-state-metrics" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.810509 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" containerName="kube-state-metrics" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.811148 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.814876 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.814957 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.846384 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.954079 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.954296 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.954407 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:47 crc kubenswrapper[4694]: I0217 17:05:47.954516 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjdq\" (UniqueName: 
\"kubernetes.io/projected/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-api-access-2jjdq\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.056446 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.056882 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.056997 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.057136 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjdq\" (UniqueName: \"kubernetes.io/projected/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-api-access-2jjdq\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.059953 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.062368 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.068762 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.079033 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjdq\" (UniqueName: \"kubernetes.io/projected/eeee4610-5faa-46a3-815b-2b04150c9abf-kube-api-access-2jjdq\") pod \"kube-state-metrics-0\" (UID: \"eeee4610-5faa-46a3-815b-2b04150c9abf\") " pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.130256 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.359445 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.382703 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.649478 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.658101 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.759837 4694 generic.go:334] "Generic (PLEG): container finished" podID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerID="4ba76574abc3ec29a756436fc22e2ca7ab118b5cb2a5130e262a1beb285b061e" exitCode=0 Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.759924 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerDied","Data":"4ba76574abc3ec29a756436fc22e2ca7ab118b5cb2a5130e262a1beb285b061e"} Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.763465 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eeee4610-5faa-46a3-815b-2b04150c9abf","Type":"ContainerStarted","Data":"4030ec41ee93a02314804232010bae1ac523901fb7a69c61ce15e45108c596ed"} Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.765517 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a32e407-e92e-4b92-a606-13e432972428","Type":"ContainerStarted","Data":"3a8da43ee6a50c051fc183d29688a51071e42dffc8e3b245b27b6031181c9e84"} Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.765541 4694 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"1a32e407-e92e-4b92-a606-13e432972428","Type":"ContainerStarted","Data":"7e220afbd9aab924989506771011d9a4bc1cd2c46a5c8884a3be341a37db68e5"} Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.783803 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.786884 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.788173 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.788158766 podStartE2EDuration="2.788158766s" podCreationTimestamp="2026-02-17 17:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:48.786586477 +0000 UTC m=+1416.543661821" watchObservedRunningTime="2026-02-17 17:05:48.788158766 +0000 UTC m=+1416.545234110" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.810901 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-combined-ca-bundle\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.810972 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t2d8\" (UniqueName: \"kubernetes.io/projected/696ed9e7-6ac5-4389-806f-5bc63b0a7412-kube-api-access-7t2d8\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.811095 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-run-httpd\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.811150 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-log-httpd\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.811198 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-scripts\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.811305 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-config-data\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.811347 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-sg-core-conf-yaml\") pod \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\" (UID: \"696ed9e7-6ac5-4389-806f-5bc63b0a7412\") " Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.814167 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.814412 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.819384 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696ed9e7-6ac5-4389-806f-5bc63b0a7412-kube-api-access-7t2d8" (OuterVolumeSpecName: "kube-api-access-7t2d8") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "kube-api-access-7t2d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.819999 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-scripts" (OuterVolumeSpecName: "scripts") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.880365 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.906246 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.917981 4694 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.918017 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.918026 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t2d8\" (UniqueName: \"kubernetes.io/projected/696ed9e7-6ac5-4389-806f-5bc63b0a7412-kube-api-access-7t2d8\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.918037 4694 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.918047 4694 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/696ed9e7-6ac5-4389-806f-5bc63b0a7412-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.918056 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.934821 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbcfb3bf-c557-4a38-855e-4be0f77b3ab0" path="/var/lib/kubelet/pods/dbcfb3bf-c557-4a38-855e-4be0f77b3ab0/volumes" Feb 17 17:05:48 crc kubenswrapper[4694]: I0217 17:05:48.965955 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-config-data" (OuterVolumeSpecName: "config-data") pod "696ed9e7-6ac5-4389-806f-5bc63b0a7412" (UID: "696ed9e7-6ac5-4389-806f-5bc63b0a7412"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.024980 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696ed9e7-6ac5-4389-806f-5bc63b0a7412-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036154 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-n6wlz"] Feb 17 17:05:49 crc kubenswrapper[4694]: E0217 17:05:49.036524 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-notification-agent" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036546 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-notification-agent" Feb 17 17:05:49 crc kubenswrapper[4694]: E0217 17:05:49.036582 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="proxy-httpd" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036590 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" 
containerName="proxy-httpd" Feb 17 17:05:49 crc kubenswrapper[4694]: E0217 17:05:49.036621 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-central-agent" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036627 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-central-agent" Feb 17 17:05:49 crc kubenswrapper[4694]: E0217 17:05:49.036642 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="sg-core" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036648 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="sg-core" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036831 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-central-agent" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036851 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="sg-core" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036864 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="proxy-httpd" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.036879 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" containerName="ceilometer-notification-agent" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.037414 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-n6wlz"] Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.037490 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.042345 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.042667 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.136664 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-scripts\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.140150 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-config-data\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.140198 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q62c\" (UniqueName: \"kubernetes.io/projected/e0984ad1-baf2-4fc2-890e-1bd93b726913-kube-api-access-5q62c\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.140293 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") 
" pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.243554 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-scripts\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.243876 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-config-data\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.243900 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q62c\" (UniqueName: \"kubernetes.io/projected/e0984ad1-baf2-4fc2-890e-1bd93b726913-kube-api-access-5q62c\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.243955 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.251134 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-config-data\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc 
kubenswrapper[4694]: I0217 17:05:49.251597 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.252128 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-scripts\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.261911 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q62c\" (UniqueName: \"kubernetes.io/projected/e0984ad1-baf2-4fc2-890e-1bd93b726913-kube-api-access-5q62c\") pod \"nova-cell1-cell-mapping-n6wlz\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.368933 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.779198 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"696ed9e7-6ac5-4389-806f-5bc63b0a7412","Type":"ContainerDied","Data":"bb8cce7411c2c5d97d4686fd617902943993a5119fc5157c30d7288f8f97be69"} Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.779273 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.779323 4694 scope.go:117] "RemoveContainer" containerID="de5af6f938daf489c589a6bedfcd75049b6d34017f9745a64cc5776f76a36f51" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.781699 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eeee4610-5faa-46a3-815b-2b04150c9abf","Type":"ContainerStarted","Data":"d658dc40adaf9cae7b1379040b85e89f6eb5b337d02f1d56fda7d51742a84bc1"} Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.803996 4694 scope.go:117] "RemoveContainer" containerID="8012ad17611604e8ceb366a7b6a41a98b3b4d6454bc9442b4c8c2b7da3a8cb36" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.811251 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.39553139 podStartE2EDuration="2.811228391s" podCreationTimestamp="2026-02-17 17:05:47 +0000 UTC" firstStartedPulling="2026-02-17 17:05:48.657871396 +0000 UTC m=+1416.414946720" lastFinishedPulling="2026-02-17 17:05:49.073568407 +0000 UTC m=+1416.830643721" observedRunningTime="2026-02-17 17:05:49.807641252 +0000 UTC m=+1417.564716586" watchObservedRunningTime="2026-02-17 17:05:49.811228391 +0000 UTC m=+1417.568303725" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.839831 4694 scope.go:117] "RemoveContainer" containerID="4ba76574abc3ec29a756436fc22e2ca7ab118b5cb2a5130e262a1beb285b061e" Feb 17 17:05:49 crc kubenswrapper[4694]: W0217 17:05:49.847753 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0984ad1_baf2_4fc2_890e_1bd93b726913.slice/crio-75ef2e80524e051a2a2d40533dd6c1ef21ded629daab01383b4ecb53fa378736 WatchSource:0}: Error finding container 75ef2e80524e051a2a2d40533dd6c1ef21ded629daab01383b4ecb53fa378736: Status 404 returned error can't find the container with id 
75ef2e80524e051a2a2d40533dd6c1ef21ded629daab01383b4ecb53fa378736 Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.855256 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.869796 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.878230 4694 scope.go:117] "RemoveContainer" containerID="25041799ed886710f6e37c9d2c47409c93f5bd09aee90c18644a31d2bbaf5627" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.902592 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-n6wlz"] Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.910919 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.913777 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.915971 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.916862 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.917016 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.920027 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.960471 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-config-data\") pod \"ceilometer-0\" (UID: 
\"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.960863 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.960967 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.961026 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-scripts\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.961055 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20e5b774-d9a5-4a32-8e29-63543214e090-run-httpd\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.961239 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 
17:05:49.961269 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4k9\" (UniqueName: \"kubernetes.io/projected/20e5b774-d9a5-4a32-8e29-63543214e090-kube-api-access-kc4k9\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:49 crc kubenswrapper[4694]: I0217 17:05:49.961306 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20e5b774-d9a5-4a32-8e29-63543214e090-log-httpd\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063239 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-scripts\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063296 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20e5b774-d9a5-4a32-8e29-63543214e090-run-httpd\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063362 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063392 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4k9\" (UniqueName: 
\"kubernetes.io/projected/20e5b774-d9a5-4a32-8e29-63543214e090-kube-api-access-kc4k9\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063494 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20e5b774-d9a5-4a32-8e29-63543214e090-log-httpd\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063535 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-config-data\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063584 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.063645 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.064455 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20e5b774-d9a5-4a32-8e29-63543214e090-run-httpd\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 
17:05:50.064551 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20e5b774-d9a5-4a32-8e29-63543214e090-log-httpd\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.067726 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-scripts\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.067847 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.067895 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.070139 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.071547 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20e5b774-d9a5-4a32-8e29-63543214e090-config-data\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " 
pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.083303 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4k9\" (UniqueName: \"kubernetes.io/projected/20e5b774-d9a5-4a32-8e29-63543214e090-kube-api-access-kc4k9\") pod \"ceilometer-0\" (UID: \"20e5b774-d9a5-4a32-8e29-63543214e090\") " pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.353155 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.593126 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vh5xg"] Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.595285 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.606510 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh5xg"] Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.673797 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.676087 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-utilities\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.676178 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-catalog-content\") pod \"redhat-operators-vh5xg\" (UID: 
\"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.676275 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjp2t\" (UniqueName: \"kubernetes.io/projected/24261cc7-a023-431d-bc69-8d8009b41a03-kube-api-access-fjp2t\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.742221 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8wkzq"] Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.742503 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerName="dnsmasq-dns" containerID="cri-o://a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83" gracePeriod=10 Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.779497 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-utilities\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.779757 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-catalog-content\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.779926 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjp2t\" (UniqueName: 
\"kubernetes.io/projected/24261cc7-a023-431d-bc69-8d8009b41a03-kube-api-access-fjp2t\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.780257 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-utilities\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.780587 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-catalog-content\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.803851 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n6wlz" event={"ID":"e0984ad1-baf2-4fc2-890e-1bd93b726913","Type":"ContainerStarted","Data":"52aba373abcffee22cf8260d1bd960ebba0cd09e3d539084d88680beccfb94d4"} Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.803906 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n6wlz" event={"ID":"e0984ad1-baf2-4fc2-890e-1bd93b726913","Type":"ContainerStarted","Data":"75ef2e80524e051a2a2d40533dd6c1ef21ded629daab01383b4ecb53fa378736"} Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.804135 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.808347 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjp2t\" (UniqueName: 
\"kubernetes.io/projected/24261cc7-a023-431d-bc69-8d8009b41a03-kube-api-access-fjp2t\") pod \"redhat-operators-vh5xg\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.836243 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-n6wlz" podStartSLOduration=2.836224092 podStartE2EDuration="2.836224092s" podCreationTimestamp="2026-02-17 17:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:05:50.830669115 +0000 UTC m=+1418.587744439" watchObservedRunningTime="2026-02-17 17:05:50.836224092 +0000 UTC m=+1418.593299416" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.858426 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.908752 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696ed9e7-6ac5-4389-806f-5bc63b0a7412" path="/var/lib/kubelet/pods/696ed9e7-6ac5-4389-806f-5bc63b0a7412/volumes" Feb 17 17:05:50 crc kubenswrapper[4694]: I0217 17:05:50.934524 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.384635 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.496689 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-swift-storage-0\") pod \"a12e4044-ba57-433d-9418-1a335dba1f0c\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.496980 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-nb\") pod \"a12e4044-ba57-433d-9418-1a335dba1f0c\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.497058 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-config\") pod \"a12e4044-ba57-433d-9418-1a335dba1f0c\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.497089 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-sb\") pod \"a12e4044-ba57-433d-9418-1a335dba1f0c\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.497172 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-758ff\" (UniqueName: \"kubernetes.io/projected/a12e4044-ba57-433d-9418-1a335dba1f0c-kube-api-access-758ff\") pod \"a12e4044-ba57-433d-9418-1a335dba1f0c\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.497216 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-svc\") pod \"a12e4044-ba57-433d-9418-1a335dba1f0c\" (UID: \"a12e4044-ba57-433d-9418-1a335dba1f0c\") " Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.502105 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12e4044-ba57-433d-9418-1a335dba1f0c-kube-api-access-758ff" (OuterVolumeSpecName: "kube-api-access-758ff") pod "a12e4044-ba57-433d-9418-1a335dba1f0c" (UID: "a12e4044-ba57-433d-9418-1a335dba1f0c"). InnerVolumeSpecName "kube-api-access-758ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.567578 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a12e4044-ba57-433d-9418-1a335dba1f0c" (UID: "a12e4044-ba57-433d-9418-1a335dba1f0c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.567816 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vh5xg"] Feb 17 17:05:51 crc kubenswrapper[4694]: W0217 17:05:51.579427 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2 WatchSource:0}: Error finding container 09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2: Status 404 returned error can't find the container with id 09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2 Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.599597 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.599649 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-758ff\" (UniqueName: \"kubernetes.io/projected/a12e4044-ba57-433d-9418-1a335dba1f0c-kube-api-access-758ff\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.617515 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a12e4044-ba57-433d-9418-1a335dba1f0c" (UID: "a12e4044-ba57-433d-9418-1a335dba1f0c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.635118 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-config" (OuterVolumeSpecName: "config") pod "a12e4044-ba57-433d-9418-1a335dba1f0c" (UID: "a12e4044-ba57-433d-9418-1a335dba1f0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.636894 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a12e4044-ba57-433d-9418-1a335dba1f0c" (UID: "a12e4044-ba57-433d-9418-1a335dba1f0c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.640022 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a12e4044-ba57-433d-9418-1a335dba1f0c" (UID: "a12e4044-ba57-433d-9418-1a335dba1f0c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.701817 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.701863 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.701875 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.701885 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12e4044-ba57-433d-9418-1a335dba1f0c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.812691 4694 generic.go:334] "Generic (PLEG): container finished" podID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerID="a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83" exitCode=0 Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.812751 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.812763 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" event={"ID":"a12e4044-ba57-433d-9418-1a335dba1f0c","Type":"ContainerDied","Data":"a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83"} Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.812810 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-8wkzq" event={"ID":"a12e4044-ba57-433d-9418-1a335dba1f0c","Type":"ContainerDied","Data":"74260c4cdbe12c0854fe0712e2da1df50b1811d14e09ed846146c876fcd686a5"} Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.812832 4694 scope.go:117] "RemoveContainer" containerID="a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.814505 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20e5b774-d9a5-4a32-8e29-63543214e090","Type":"ContainerStarted","Data":"012a62805312388d3a7efacfdad5bc154a0aa856d76da66a9de9ad3a72afab67"} Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.814530 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20e5b774-d9a5-4a32-8e29-63543214e090","Type":"ContainerStarted","Data":"6362d962aa15d9b940c69b8be5d95894674f3ddbcf2293ec5243c7f2cf2e7b40"} Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.816220 4694 generic.go:334] "Generic (PLEG): container finished" podID="24261cc7-a023-431d-bc69-8d8009b41a03" containerID="230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf" exitCode=0 Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.816303 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" 
event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerDied","Data":"230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf"} Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.816337 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerStarted","Data":"09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2"} Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.832206 4694 scope.go:117] "RemoveContainer" containerID="f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.873099 4694 scope.go:117] "RemoveContainer" containerID="a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83" Feb 17 17:05:51 crc kubenswrapper[4694]: E0217 17:05:51.873571 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83\": container with ID starting with a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83 not found: ID does not exist" containerID="a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.873600 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83"} err="failed to get container status \"a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83\": rpc error: code = NotFound desc = could not find container \"a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83\": container with ID starting with a13c0ff50078921c11b6772bd15fed627fdda67a798f5c0c1e66774b2595fb83 not found: ID does not exist" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.873633 4694 scope.go:117] "RemoveContainer" 
containerID="f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db" Feb 17 17:05:51 crc kubenswrapper[4694]: E0217 17:05:51.873984 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db\": container with ID starting with f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db not found: ID does not exist" containerID="f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.874005 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db"} err="failed to get container status \"f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db\": rpc error: code = NotFound desc = could not find container \"f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db\": container with ID starting with f759d14dff7bc421e24e1f296ec41393c8ed01395901c96c1aee0a89c4c7f6db not found: ID does not exist" Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.876382 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8wkzq"] Feb 17 17:05:51 crc kubenswrapper[4694]: I0217 17:05:51.890161 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-8wkzq"] Feb 17 17:05:52 crc kubenswrapper[4694]: I0217 17:05:52.828394 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20e5b774-d9a5-4a32-8e29-63543214e090","Type":"ContainerStarted","Data":"d59d7e25e277455c804aa9f826a3d1eef2ba769db6aed23b508691114d695753"} Feb 17 17:05:52 crc kubenswrapper[4694]: I0217 17:05:52.830454 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" 
event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerStarted","Data":"50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d"} Feb 17 17:05:52 crc kubenswrapper[4694]: I0217 17:05:52.907022 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" path="/var/lib/kubelet/pods/a12e4044-ba57-433d-9418-1a335dba1f0c/volumes" Feb 17 17:05:53 crc kubenswrapper[4694]: I0217 17:05:53.842401 4694 generic.go:334] "Generic (PLEG): container finished" podID="24261cc7-a023-431d-bc69-8d8009b41a03" containerID="50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d" exitCode=0 Feb 17 17:05:53 crc kubenswrapper[4694]: I0217 17:05:53.842750 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerDied","Data":"50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d"} Feb 17 17:05:53 crc kubenswrapper[4694]: I0217 17:05:53.847638 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20e5b774-d9a5-4a32-8e29-63543214e090","Type":"ContainerStarted","Data":"2caae34c376adbb58df0002c4f742ff965e44dd1f9f3ff4c916cfceb4c0c6e54"} Feb 17 17:05:54 crc kubenswrapper[4694]: I0217 17:05:54.864892 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20e5b774-d9a5-4a32-8e29-63543214e090","Type":"ContainerStarted","Data":"262f4a4e1a6d789205d5982e4554c7da8cef488edeef576c0f41b6010a319b48"} Feb 17 17:05:54 crc kubenswrapper[4694]: I0217 17:05:54.866740 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 17:05:54 crc kubenswrapper[4694]: I0217 17:05:54.896688 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.478665298 podStartE2EDuration="5.896661696s" podCreationTimestamp="2026-02-17 
17:05:49 +0000 UTC" firstStartedPulling="2026-02-17 17:05:50.851519819 +0000 UTC m=+1418.608595143" lastFinishedPulling="2026-02-17 17:05:54.269516217 +0000 UTC m=+1422.026591541" observedRunningTime="2026-02-17 17:05:54.886462065 +0000 UTC m=+1422.643537389" watchObservedRunningTime="2026-02-17 17:05:54.896661696 +0000 UTC m=+1422.653737020" Feb 17 17:05:55 crc kubenswrapper[4694]: I0217 17:05:55.874716 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerStarted","Data":"ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808"} Feb 17 17:05:55 crc kubenswrapper[4694]: I0217 17:05:55.897354 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vh5xg" podStartSLOduration=2.766813265 podStartE2EDuration="5.89733773s" podCreationTimestamp="2026-02-17 17:05:50 +0000 UTC" firstStartedPulling="2026-02-17 17:05:51.817939968 +0000 UTC m=+1419.575015292" lastFinishedPulling="2026-02-17 17:05:54.948464433 +0000 UTC m=+1422.705539757" observedRunningTime="2026-02-17 17:05:55.894050489 +0000 UTC m=+1423.651125813" watchObservedRunningTime="2026-02-17 17:05:55.89733773 +0000 UTC m=+1423.654413054" Feb 17 17:05:56 crc kubenswrapper[4694]: I0217 17:05:56.883844 4694 generic.go:334] "Generic (PLEG): container finished" podID="e0984ad1-baf2-4fc2-890e-1bd93b726913" containerID="52aba373abcffee22cf8260d1bd960ebba0cd09e3d539084d88680beccfb94d4" exitCode=0 Feb 17 17:05:56 crc kubenswrapper[4694]: I0217 17:05:56.883890 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n6wlz" event={"ID":"e0984ad1-baf2-4fc2-890e-1bd93b726913","Type":"ContainerDied","Data":"52aba373abcffee22cf8260d1bd960ebba0cd09e3d539084d88680beccfb94d4"} Feb 17 17:05:57 crc kubenswrapper[4694]: I0217 17:05:57.156331 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:05:57 crc kubenswrapper[4694]: I0217 17:05:57.156719 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.170949 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.171357 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.185941 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.282451 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.324963 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q62c\" (UniqueName: \"kubernetes.io/projected/e0984ad1-baf2-4fc2-890e-1bd93b726913-kube-api-access-5q62c\") pod \"e0984ad1-baf2-4fc2-890e-1bd93b726913\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.325038 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-combined-ca-bundle\") pod \"e0984ad1-baf2-4fc2-890e-1bd93b726913\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.325104 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-scripts\") pod \"e0984ad1-baf2-4fc2-890e-1bd93b726913\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.325176 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-config-data\") pod \"e0984ad1-baf2-4fc2-890e-1bd93b726913\" (UID: \"e0984ad1-baf2-4fc2-890e-1bd93b726913\") " Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.331741 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-scripts" (OuterVolumeSpecName: "scripts") pod "e0984ad1-baf2-4fc2-890e-1bd93b726913" (UID: "e0984ad1-baf2-4fc2-890e-1bd93b726913"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.335881 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0984ad1-baf2-4fc2-890e-1bd93b726913-kube-api-access-5q62c" (OuterVolumeSpecName: "kube-api-access-5q62c") pod "e0984ad1-baf2-4fc2-890e-1bd93b726913" (UID: "e0984ad1-baf2-4fc2-890e-1bd93b726913"). InnerVolumeSpecName "kube-api-access-5q62c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.353945 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0984ad1-baf2-4fc2-890e-1bd93b726913" (UID: "e0984ad1-baf2-4fc2-890e-1bd93b726913"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.354404 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-config-data" (OuterVolumeSpecName: "config-data") pod "e0984ad1-baf2-4fc2-890e-1bd93b726913" (UID: "e0984ad1-baf2-4fc2-890e-1bd93b726913"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.427187 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.427218 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q62c\" (UniqueName: \"kubernetes.io/projected/e0984ad1-baf2-4fc2-890e-1bd93b726913-kube-api-access-5q62c\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.427230 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.427239 4694 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0984ad1-baf2-4fc2-890e-1bd93b726913-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.907164 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-n6wlz" event={"ID":"e0984ad1-baf2-4fc2-890e-1bd93b726913","Type":"ContainerDied","Data":"75ef2e80524e051a2a2d40533dd6c1ef21ded629daab01383b4ecb53fa378736"} Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.907202 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75ef2e80524e051a2a2d40533dd6c1ef21ded629daab01383b4ecb53fa378736" Feb 17 17:05:58 crc kubenswrapper[4694]: I0217 17:05:58.907233 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-n6wlz" Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.180680 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.181937 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-log" containerID="cri-o://7e220afbd9aab924989506771011d9a4bc1cd2c46a5c8884a3be341a37db68e5" gracePeriod=30 Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.182711 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-api" containerID="cri-o://3a8da43ee6a50c051fc183d29688a51071e42dffc8e3b245b27b6031181c9e84" gracePeriod=30 Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.201756 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.202353 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" containerName="nova-scheduler-scheduler" containerID="cri-o://91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b" gracePeriod=30 Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.216755 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.216987 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-log" containerID="cri-o://b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b" gracePeriod=30 Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.217139 4694 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-metadata" containerID="cri-o://854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d" gracePeriod=30 Feb 17 17:05:59 crc kubenswrapper[4694]: E0217 17:05:59.905016 4694 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 17:05:59 crc kubenswrapper[4694]: E0217 17:05:59.908354 4694 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 17:05:59 crc kubenswrapper[4694]: E0217 17:05:59.931077 4694 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 17:05:59 crc kubenswrapper[4694]: E0217 17:05:59.931166 4694 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" containerName="nova-scheduler-scheduler" Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.933421 4694 generic.go:334] "Generic (PLEG): container finished" 
podID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerID="b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b" exitCode=143 Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.933524 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"143d7405-b363-476c-948d-a8fdb7cbbe5d","Type":"ContainerDied","Data":"b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b"} Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.944543 4694 generic.go:334] "Generic (PLEG): container finished" podID="1a32e407-e92e-4b92-a606-13e432972428" containerID="7e220afbd9aab924989506771011d9a4bc1cd2c46a5c8884a3be341a37db68e5" exitCode=143 Feb 17 17:05:59 crc kubenswrapper[4694]: I0217 17:05:59.944587 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a32e407-e92e-4b92-a606-13e432972428","Type":"ContainerDied","Data":"7e220afbd9aab924989506771011d9a4bc1cd2c46a5c8884a3be341a37db68e5"} Feb 17 17:06:00 crc kubenswrapper[4694]: I0217 17:06:00.934718 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:06:00 crc kubenswrapper[4694]: I0217 17:06:00.935182 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:06:01 crc kubenswrapper[4694]: I0217 17:06:01.980211 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vh5xg" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="registry-server" probeResult="failure" output=< Feb 17 17:06:01 crc kubenswrapper[4694]: timeout: failed to connect service ":50051" within 1s Feb 17 17:06:01 crc kubenswrapper[4694]: > Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.354635 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:47984->10.217.0.197:8775: read: connection reset by peer" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.354694 4694 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:47968->10.217.0.197:8775: read: connection reset by peer" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.817914 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.913402 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-nova-metadata-tls-certs\") pod \"143d7405-b363-476c-948d-a8fdb7cbbe5d\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.913446 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/143d7405-b363-476c-948d-a8fdb7cbbe5d-logs\") pod \"143d7405-b363-476c-948d-a8fdb7cbbe5d\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.913618 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-combined-ca-bundle\") pod \"143d7405-b363-476c-948d-a8fdb7cbbe5d\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.913689 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lspwq\" (UniqueName: 
\"kubernetes.io/projected/143d7405-b363-476c-948d-a8fdb7cbbe5d-kube-api-access-lspwq\") pod \"143d7405-b363-476c-948d-a8fdb7cbbe5d\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.913734 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-config-data\") pod \"143d7405-b363-476c-948d-a8fdb7cbbe5d\" (UID: \"143d7405-b363-476c-948d-a8fdb7cbbe5d\") " Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.914905 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143d7405-b363-476c-948d-a8fdb7cbbe5d-logs" (OuterVolumeSpecName: "logs") pod "143d7405-b363-476c-948d-a8fdb7cbbe5d" (UID: "143d7405-b363-476c-948d-a8fdb7cbbe5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.919132 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143d7405-b363-476c-948d-a8fdb7cbbe5d-kube-api-access-lspwq" (OuterVolumeSpecName: "kube-api-access-lspwq") pod "143d7405-b363-476c-948d-a8fdb7cbbe5d" (UID: "143d7405-b363-476c-948d-a8fdb7cbbe5d"). InnerVolumeSpecName "kube-api-access-lspwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.947473 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-config-data" (OuterVolumeSpecName: "config-data") pod "143d7405-b363-476c-948d-a8fdb7cbbe5d" (UID: "143d7405-b363-476c-948d-a8fdb7cbbe5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.950195 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "143d7405-b363-476c-948d-a8fdb7cbbe5d" (UID: "143d7405-b363-476c-948d-a8fdb7cbbe5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.971824 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "143d7405-b363-476c-948d-a8fdb7cbbe5d" (UID: "143d7405-b363-476c-948d-a8fdb7cbbe5d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.975518 4694 generic.go:334] "Generic (PLEG): container finished" podID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerID="854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d" exitCode=0 Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.975562 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"143d7405-b363-476c-948d-a8fdb7cbbe5d","Type":"ContainerDied","Data":"854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d"} Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.975588 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"143d7405-b363-476c-948d-a8fdb7cbbe5d","Type":"ContainerDied","Data":"f25e25af98c5bf0ed5b64f7c689c4db5cb643b4ac04211d49ed6353d96dd7b6f"} Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.975605 4694 scope.go:117] "RemoveContainer" containerID="854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d" 
Feb 17 17:06:02 crc kubenswrapper[4694]: I0217 17:06:02.975800 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.018191 4694 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.018217 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/143d7405-b363-476c-948d-a8fdb7cbbe5d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.018230 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.018242 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lspwq\" (UniqueName: \"kubernetes.io/projected/143d7405-b363-476c-948d-a8fdb7cbbe5d-kube-api-access-lspwq\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.018253 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143d7405-b363-476c-948d-a8fdb7cbbe5d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.039222 4694 scope.go:117] "RemoveContainer" containerID="b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.044994 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.058679 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.080775 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:06:03 crc kubenswrapper[4694]: E0217 17:06:03.081648 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-log" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.081746 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-log" Feb 17 17:06:03 crc kubenswrapper[4694]: E0217 17:06:03.081824 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-metadata" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.081889 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-metadata" Feb 17 17:06:03 crc kubenswrapper[4694]: E0217 17:06:03.081975 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerName="init" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.082043 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerName="init" Feb 17 17:06:03 crc kubenswrapper[4694]: E0217 17:06:03.082124 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerName="dnsmasq-dns" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.082211 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerName="dnsmasq-dns" Feb 17 17:06:03 crc kubenswrapper[4694]: E0217 17:06:03.082307 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0984ad1-baf2-4fc2-890e-1bd93b726913" containerName="nova-manage" Feb 17 17:06:03 crc 
kubenswrapper[4694]: I0217 17:06:03.082383 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0984ad1-baf2-4fc2-890e-1bd93b726913" containerName="nova-manage" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.086084 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12e4044-ba57-433d-9418-1a335dba1f0c" containerName="dnsmasq-dns" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.086120 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-metadata" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.086141 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0984ad1-baf2-4fc2-890e-1bd93b726913" containerName="nova-manage" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.086154 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" containerName="nova-metadata-log" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.087137 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.089118 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.089699 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.090784 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.099063 4694 scope.go:117] "RemoveContainer" containerID="854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d" Feb 17 17:06:03 crc kubenswrapper[4694]: E0217 17:06:03.100030 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d\": container with ID starting with 854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d not found: ID does not exist" containerID="854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.100103 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d"} err="failed to get container status \"854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d\": rpc error: code = NotFound desc = could not find container \"854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d\": container with ID starting with 854ff8df28e05e146e45efd06b46b23416612df689439635a236ec19f435d56d not found: ID does not exist" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.100128 4694 scope.go:117] "RemoveContainer" containerID="b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b" Feb 17 17:06:03 crc 
kubenswrapper[4694]: E0217 17:06:03.100376 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b\": container with ID starting with b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b not found: ID does not exist" containerID="b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.100429 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b"} err="failed to get container status \"b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b\": rpc error: code = NotFound desc = could not find container \"b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b\": container with ID starting with b0f1ccae30a40fab7a77a9d5065798aaa06dbe3d9e7e233bd624f2ef3da3b23b not found: ID does not exist" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.120451 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.120553 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.120637 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f396879b-c24b-478f-b98f-24347a13a36d-logs\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.120729 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8tt\" (UniqueName: \"kubernetes.io/projected/f396879b-c24b-478f-b98f-24347a13a36d-kube-api-access-jd8tt\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.120815 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-config-data\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.222399 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.222483 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.222558 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f396879b-c24b-478f-b98f-24347a13a36d-logs\") pod \"nova-metadata-0\" (UID: 
\"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.222592 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8tt\" (UniqueName: \"kubernetes.io/projected/f396879b-c24b-478f-b98f-24347a13a36d-kube-api-access-jd8tt\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.222650 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-config-data\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.222984 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f396879b-c24b-478f-b98f-24347a13a36d-logs\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.226495 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.227590 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.227791 4694 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f396879b-c24b-478f-b98f-24347a13a36d-config-data\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.241143 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8tt\" (UniqueName: \"kubernetes.io/projected/f396879b-c24b-478f-b98f-24347a13a36d-kube-api-access-jd8tt\") pod \"nova-metadata-0\" (UID: \"f396879b-c24b-478f-b98f-24347a13a36d\") " pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.413346 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.853649 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 17:06:03 crc kubenswrapper[4694]: W0217 17:06:03.855180 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf396879b_c24b_478f_b98f_24347a13a36d.slice/crio-0974b9a4e955a3c7524fa6a3bfa4733bd6f26c76214d16b9e347b17aca584a86 WatchSource:0}: Error finding container 0974b9a4e955a3c7524fa6a3bfa4733bd6f26c76214d16b9e347b17aca584a86: Status 404 returned error can't find the container with id 0974b9a4e955a3c7524fa6a3bfa4733bd6f26c76214d16b9e347b17aca584a86 Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.986088 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f396879b-c24b-478f-b98f-24347a13a36d","Type":"ContainerStarted","Data":"0974b9a4e955a3c7524fa6a3bfa4733bd6f26c76214d16b9e347b17aca584a86"} Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.993473 4694 generic.go:334] "Generic (PLEG): container finished" podID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" containerID="91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b" 
exitCode=0 Feb 17 17:06:03 crc kubenswrapper[4694]: I0217 17:06:03.993541 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8475e1ad-422c-4ae0-9230-052dc4e0e8fc","Type":"ContainerDied","Data":"91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b"} Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.001316 4694 generic.go:334] "Generic (PLEG): container finished" podID="1a32e407-e92e-4b92-a606-13e432972428" containerID="3a8da43ee6a50c051fc183d29688a51071e42dffc8e3b245b27b6031181c9e84" exitCode=0 Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.001362 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a32e407-e92e-4b92-a606-13e432972428","Type":"ContainerDied","Data":"3a8da43ee6a50c051fc183d29688a51071e42dffc8e3b245b27b6031181c9e84"} Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.040044 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.097328 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.143273 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a32e407-e92e-4b92-a606-13e432972428-logs\") pod \"1a32e407-e92e-4b92-a606-13e432972428\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.143440 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-445z7\" (UniqueName: \"kubernetes.io/projected/1a32e407-e92e-4b92-a606-13e432972428-kube-api-access-445z7\") pod \"1a32e407-e92e-4b92-a606-13e432972428\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.143512 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-config-data\") pod \"1a32e407-e92e-4b92-a606-13e432972428\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.143560 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-internal-tls-certs\") pod \"1a32e407-e92e-4b92-a606-13e432972428\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.143590 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-public-tls-certs\") pod \"1a32e407-e92e-4b92-a606-13e432972428\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.143633 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-combined-ca-bundle\") pod \"1a32e407-e92e-4b92-a606-13e432972428\" (UID: \"1a32e407-e92e-4b92-a606-13e432972428\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.149665 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a32e407-e92e-4b92-a606-13e432972428-logs" (OuterVolumeSpecName: "logs") pod "1a32e407-e92e-4b92-a606-13e432972428" (UID: "1a32e407-e92e-4b92-a606-13e432972428"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.158290 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a32e407-e92e-4b92-a606-13e432972428-kube-api-access-445z7" (OuterVolumeSpecName: "kube-api-access-445z7") pod "1a32e407-e92e-4b92-a606-13e432972428" (UID: "1a32e407-e92e-4b92-a606-13e432972428"). InnerVolumeSpecName "kube-api-access-445z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.181679 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a32e407-e92e-4b92-a606-13e432972428" (UID: "1a32e407-e92e-4b92-a606-13e432972428"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.197501 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-config-data" (OuterVolumeSpecName: "config-data") pod "1a32e407-e92e-4b92-a606-13e432972428" (UID: "1a32e407-e92e-4b92-a606-13e432972428"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.234353 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a32e407-e92e-4b92-a606-13e432972428" (UID: "1a32e407-e92e-4b92-a606-13e432972428"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.234652 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a32e407-e92e-4b92-a606-13e432972428" (UID: "1a32e407-e92e-4b92-a606-13e432972428"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.245717 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c45zf\" (UniqueName: \"kubernetes.io/projected/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-kube-api-access-c45zf\") pod \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.246165 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-combined-ca-bundle\") pod \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\" (UID: \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.246304 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-config-data\") pod \"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\" (UID: 
\"8475e1ad-422c-4ae0-9230-052dc4e0e8fc\") " Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.247439 4694 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a32e407-e92e-4b92-a606-13e432972428-logs\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.247546 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-445z7\" (UniqueName: \"kubernetes.io/projected/1a32e407-e92e-4b92-a606-13e432972428-kube-api-access-445z7\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.247664 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.247746 4694 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.247839 4694 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.247915 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a32e407-e92e-4b92-a606-13e432972428-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.249017 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-kube-api-access-c45zf" (OuterVolumeSpecName: "kube-api-access-c45zf") pod "8475e1ad-422c-4ae0-9230-052dc4e0e8fc" (UID: 
"8475e1ad-422c-4ae0-9230-052dc4e0e8fc"). InnerVolumeSpecName "kube-api-access-c45zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.269353 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-config-data" (OuterVolumeSpecName: "config-data") pod "8475e1ad-422c-4ae0-9230-052dc4e0e8fc" (UID: "8475e1ad-422c-4ae0-9230-052dc4e0e8fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.271342 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8475e1ad-422c-4ae0-9230-052dc4e0e8fc" (UID: "8475e1ad-422c-4ae0-9230-052dc4e0e8fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.348502 4694 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.348540 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.348554 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c45zf\" (UniqueName: \"kubernetes.io/projected/8475e1ad-422c-4ae0-9230-052dc4e0e8fc-kube-api-access-c45zf\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:04 crc kubenswrapper[4694]: I0217 17:06:04.904954 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="143d7405-b363-476c-948d-a8fdb7cbbe5d" path="/var/lib/kubelet/pods/143d7405-b363-476c-948d-a8fdb7cbbe5d/volumes" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.013430 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a32e407-e92e-4b92-a606-13e432972428","Type":"ContainerDied","Data":"7ea7126c84b25fc479078804145e8bfae18efcbf86897b09834abfa0b6c5a1c9"} Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.013516 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.013538 4694 scope.go:117] "RemoveContainer" containerID="3a8da43ee6a50c051fc183d29688a51071e42dffc8e3b245b27b6031181c9e84" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.022400 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f396879b-c24b-478f-b98f-24347a13a36d","Type":"ContainerStarted","Data":"c94297f1a8949026e84c53efe54e1f095dcf8147e474ce3a871dd7cb057a0461"} Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.022460 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f396879b-c24b-478f-b98f-24347a13a36d","Type":"ContainerStarted","Data":"83f9fd36db32a32b6639a1a708be80b1ba2bb84eaef3282bd9d64a23aa982b0b"} Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.025589 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8475e1ad-422c-4ae0-9230-052dc4e0e8fc","Type":"ContainerDied","Data":"a49985fb6964b7688d0cf3cbfbe21f3aa3f45e91544533c2ab03e45a518c776a"} Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.025703 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.044967 4694 scope.go:117] "RemoveContainer" containerID="7e220afbd9aab924989506771011d9a4bc1cd2c46a5c8884a3be341a37db68e5" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.100460 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.100435042 podStartE2EDuration="2.100435042s" podCreationTimestamp="2026-02-17 17:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:06:05.045855546 +0000 UTC m=+1432.802930870" watchObservedRunningTime="2026-02-17 17:06:05.100435042 +0000 UTC m=+1432.857510366" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.117832 4694 scope.go:117] "RemoveContainer" containerID="91df459a6adfdd8e9c8812b05a5b47c87be5738fafee47fe835c67d3f83e5c6b" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.149936 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.162716 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.175713 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: E0217 17:06:05.176335 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-api" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.176358 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-api" Feb 17 17:06:05 crc kubenswrapper[4694]: E0217 17:06:05.176393 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" 
containerName="nova-scheduler-scheduler" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.176403 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" containerName="nova-scheduler-scheduler" Feb 17 17:06:05 crc kubenswrapper[4694]: E0217 17:06:05.176496 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-log" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.176517 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-log" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.176933 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-api" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.176969 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a32e407-e92e-4b92-a606-13e432972428" containerName="nova-api-log" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.176982 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" containerName="nova-scheduler-scheduler" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.178772 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: E0217 17:06:05.179046 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8475e1ad_422c_4ae0_9230_052dc4e0e8fc.slice/crio-a49985fb6964b7688d0cf3cbfbe21f3aa3f45e91544533c2ab03e45a518c776a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8475e1ad_422c_4ae0_9230_052dc4e0e8fc.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.181005 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.182655 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.183319 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.188234 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.204707 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.214106 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.222542 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.224582 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.226822 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.234542 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.273431 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkg6\" (UniqueName: \"kubernetes.io/projected/a8083aa6-ef30-42ca-b979-e21a0697ce79-kube-api-access-skkg6\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.273498 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-config-data\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.273671 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8083aa6-ef30-42ca-b979-e21a0697ce79-logs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.273911 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.274052 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.274121 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376077 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkg6\" (UniqueName: \"kubernetes.io/projected/a8083aa6-ef30-42ca-b979-e21a0697ce79-kube-api-access-skkg6\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376138 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qnll\" (UniqueName: \"kubernetes.io/projected/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-kube-api-access-7qnll\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376183 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-config-data\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376208 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8083aa6-ef30-42ca-b979-e21a0697ce79-logs\") pod \"nova-api-0\" (UID: 
\"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376361 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376525 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376640 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-config-data\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376740 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376783 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8083aa6-ef30-42ca-b979-e21a0697ce79-logs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.376844 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.381755 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.382762 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-config-data\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.383207 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.384379 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8083aa6-ef30-42ca-b979-e21a0697ce79-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.395542 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkg6\" (UniqueName: \"kubernetes.io/projected/a8083aa6-ef30-42ca-b979-e21a0697ce79-kube-api-access-skkg6\") pod \"nova-api-0\" (UID: \"a8083aa6-ef30-42ca-b979-e21a0697ce79\") " pod="openstack/nova-api-0" Feb 17 17:06:05 crc 
kubenswrapper[4694]: I0217 17:06:05.479019 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qnll\" (UniqueName: \"kubernetes.io/projected/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-kube-api-access-7qnll\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.479176 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.479238 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-config-data\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.483409 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-config-data\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.483476 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.504954 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qnll\" (UniqueName: 
\"kubernetes.io/projected/204bee86-a2fa-4fb3-bb90-60f67cb66bc7-kube-api-access-7qnll\") pod \"nova-scheduler-0\" (UID: \"204bee86-a2fa-4fb3-bb90-60f67cb66bc7\") " pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.506915 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.553134 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 17:06:05 crc kubenswrapper[4694]: I0217 17:06:05.962696 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 17:06:06 crc kubenswrapper[4694]: I0217 17:06:06.043839 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 17:06:06 crc kubenswrapper[4694]: I0217 17:06:06.045312 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8083aa6-ef30-42ca-b979-e21a0697ce79","Type":"ContainerStarted","Data":"96914ebefdab6c973efb7b56b1e339e3ab1ba7e8f372b97872e3e01c4eccf060"} Feb 17 17:06:06 crc kubenswrapper[4694]: W0217 17:06:06.045950 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204bee86_a2fa_4fb3_bb90_60f67cb66bc7.slice/crio-13b080853cd8f7c30cdc36f4284aa9115f52b6addb15eb6ec6d3b5a0fce8885a WatchSource:0}: Error finding container 13b080853cd8f7c30cdc36f4284aa9115f52b6addb15eb6ec6d3b5a0fce8885a: Status 404 returned error can't find the container with id 13b080853cd8f7c30cdc36f4284aa9115f52b6addb15eb6ec6d3b5a0fce8885a Feb 17 17:06:06 crc kubenswrapper[4694]: I0217 17:06:06.907716 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a32e407-e92e-4b92-a606-13e432972428" path="/var/lib/kubelet/pods/1a32e407-e92e-4b92-a606-13e432972428/volumes" Feb 17 17:06:06 crc kubenswrapper[4694]: I0217 17:06:06.914526 4694 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8475e1ad-422c-4ae0-9230-052dc4e0e8fc" path="/var/lib/kubelet/pods/8475e1ad-422c-4ae0-9230-052dc4e0e8fc/volumes" Feb 17 17:06:07 crc kubenswrapper[4694]: I0217 17:06:07.057822 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"204bee86-a2fa-4fb3-bb90-60f67cb66bc7","Type":"ContainerStarted","Data":"209248bfeb42091acb29dcce535b17d23741b4ef566909ecc5e22199b742dc5e"} Feb 17 17:06:07 crc kubenswrapper[4694]: I0217 17:06:07.058149 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"204bee86-a2fa-4fb3-bb90-60f67cb66bc7","Type":"ContainerStarted","Data":"13b080853cd8f7c30cdc36f4284aa9115f52b6addb15eb6ec6d3b5a0fce8885a"} Feb 17 17:06:07 crc kubenswrapper[4694]: I0217 17:06:07.061319 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8083aa6-ef30-42ca-b979-e21a0697ce79","Type":"ContainerStarted","Data":"b836e7d31c71e99d6b818f9780e58fe1e0ee53876ef44aed84ab8c58e6d2795e"} Feb 17 17:06:07 crc kubenswrapper[4694]: I0217 17:06:07.061470 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8083aa6-ef30-42ca-b979-e21a0697ce79","Type":"ContainerStarted","Data":"3a76e5e7692b4e53a50546b6740b356baa0ca94738c242ce852db0667b42e145"} Feb 17 17:06:07 crc kubenswrapper[4694]: I0217 17:06:07.085076 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.085041879 podStartE2EDuration="2.085041879s" podCreationTimestamp="2026-02-17 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:06:07.081747488 +0000 UTC m=+1434.838822892" watchObservedRunningTime="2026-02-17 17:06:07.085041879 +0000 UTC m=+1434.842117243" Feb 17 17:06:07 crc kubenswrapper[4694]: I0217 
17:06:07.116957 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.116935446 podStartE2EDuration="2.116935446s" podCreationTimestamp="2026-02-17 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:06:07.101438353 +0000 UTC m=+1434.858513687" watchObservedRunningTime="2026-02-17 17:06:07.116935446 +0000 UTC m=+1434.874010780" Feb 17 17:06:08 crc kubenswrapper[4694]: I0217 17:06:08.414203 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:06:08 crc kubenswrapper[4694]: I0217 17:06:08.414552 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 17:06:10 crc kubenswrapper[4694]: I0217 17:06:10.553997 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 17:06:10 crc kubenswrapper[4694]: I0217 17:06:10.994367 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:06:11 crc kubenswrapper[4694]: I0217 17:06:11.063700 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:06:11 crc kubenswrapper[4694]: I0217 17:06:11.228823 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh5xg"] Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.103326 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vh5xg" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="registry-server" containerID="cri-o://ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808" gracePeriod=2 Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.587265 4694 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.738207 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjp2t\" (UniqueName: \"kubernetes.io/projected/24261cc7-a023-431d-bc69-8d8009b41a03-kube-api-access-fjp2t\") pod \"24261cc7-a023-431d-bc69-8d8009b41a03\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.738383 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-utilities\") pod \"24261cc7-a023-431d-bc69-8d8009b41a03\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.738871 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-catalog-content\") pod \"24261cc7-a023-431d-bc69-8d8009b41a03\" (UID: \"24261cc7-a023-431d-bc69-8d8009b41a03\") " Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.739490 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-utilities" (OuterVolumeSpecName: "utilities") pod "24261cc7-a023-431d-bc69-8d8009b41a03" (UID: "24261cc7-a023-431d-bc69-8d8009b41a03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.744917 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24261cc7-a023-431d-bc69-8d8009b41a03-kube-api-access-fjp2t" (OuterVolumeSpecName: "kube-api-access-fjp2t") pod "24261cc7-a023-431d-bc69-8d8009b41a03" (UID: "24261cc7-a023-431d-bc69-8d8009b41a03"). 
InnerVolumeSpecName "kube-api-access-fjp2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.841207 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjp2t\" (UniqueName: \"kubernetes.io/projected/24261cc7-a023-431d-bc69-8d8009b41a03-kube-api-access-fjp2t\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.841542 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.860830 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24261cc7-a023-431d-bc69-8d8009b41a03" (UID: "24261cc7-a023-431d-bc69-8d8009b41a03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:12 crc kubenswrapper[4694]: I0217 17:06:12.943141 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24261cc7-a023-431d-bc69-8d8009b41a03-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.117963 4694 generic.go:334] "Generic (PLEG): container finished" podID="24261cc7-a023-431d-bc69-8d8009b41a03" containerID="ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808" exitCode=0 Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.118008 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerDied","Data":"ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808"} Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.118048 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vh5xg" event={"ID":"24261cc7-a023-431d-bc69-8d8009b41a03","Type":"ContainerDied","Data":"09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2"} Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.118070 4694 scope.go:117] "RemoveContainer" containerID="ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.118083 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vh5xg" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.143991 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vh5xg"] Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.151387 4694 scope.go:117] "RemoveContainer" containerID="50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.151839 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vh5xg"] Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.181058 4694 scope.go:117] "RemoveContainer" containerID="230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.217913 4694 scope.go:117] "RemoveContainer" containerID="ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808" Feb 17 17:06:13 crc kubenswrapper[4694]: E0217 17:06:13.218426 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808\": container with ID starting with ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808 not found: ID does not exist" containerID="ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.218465 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808"} err="failed to get container status \"ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808\": rpc error: code = NotFound desc = could not find container \"ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808\": container with ID starting with ec3bb2d3ddc0c661d3239a3d07a0f2793bb447c42c313b237419c0c6807ae808 not found: ID does 
not exist" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.218492 4694 scope.go:117] "RemoveContainer" containerID="50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d" Feb 17 17:06:13 crc kubenswrapper[4694]: E0217 17:06:13.218958 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d\": container with ID starting with 50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d not found: ID does not exist" containerID="50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.218997 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d"} err="failed to get container status \"50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d\": rpc error: code = NotFound desc = could not find container \"50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d\": container with ID starting with 50fc72c1a99665b8fb9453acd7ea8a78f97dc01eac1aec44b10a38472df5f50d not found: ID does not exist" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.219023 4694 scope.go:117] "RemoveContainer" containerID="230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf" Feb 17 17:06:13 crc kubenswrapper[4694]: E0217 17:06:13.219268 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf\": container with ID starting with 230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf not found: ID does not exist" containerID="230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.219296 4694 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf"} err="failed to get container status \"230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf\": rpc error: code = NotFound desc = could not find container \"230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf\": container with ID starting with 230bdab994ff08edcbf43dfcc86188c5e1000f0a1f9e19224898b104d630ddcf not found: ID does not exist" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.414050 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 17:06:13 crc kubenswrapper[4694]: I0217 17:06:13.414436 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.461844 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f396879b-c24b-478f-b98f-24347a13a36d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.461872 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f396879b-c24b-478f-b98f-24347a13a36d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.636675 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n445k"] Feb 17 17:06:14 crc kubenswrapper[4694]: E0217 17:06:14.637032 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="extract-utilities" Feb 17 17:06:14 crc 
kubenswrapper[4694]: I0217 17:06:14.637048 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="extract-utilities" Feb 17 17:06:14 crc kubenswrapper[4694]: E0217 17:06:14.637063 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="extract-content" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.637070 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="extract-content" Feb 17 17:06:14 crc kubenswrapper[4694]: E0217 17:06:14.637085 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="registry-server" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.637092 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="registry-server" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.637266 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" containerName="registry-server" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.638468 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.657512 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n445k"] Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.778268 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-catalog-content\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.778380 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-utilities\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.778409 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkwzs\" (UniqueName: \"kubernetes.io/projected/66856e01-6a23-43ab-9abb-eac237a7a192-kube-api-access-mkwzs\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.879870 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-utilities\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.879929 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mkwzs\" (UniqueName: \"kubernetes.io/projected/66856e01-6a23-43ab-9abb-eac237a7a192-kube-api-access-mkwzs\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.880087 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-catalog-content\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.880344 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-utilities\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.880487 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-catalog-content\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.903471 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkwzs\" (UniqueName: \"kubernetes.io/projected/66856e01-6a23-43ab-9abb-eac237a7a192-kube-api-access-mkwzs\") pod \"community-operators-n445k\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.916508 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="24261cc7-a023-431d-bc69-8d8009b41a03" path="/var/lib/kubelet/pods/24261cc7-a023-431d-bc69-8d8009b41a03/volumes" Feb 17 17:06:14 crc kubenswrapper[4694]: I0217 17:06:14.956748 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:15 crc kubenswrapper[4694]: E0217 17:06:15.418796 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2\": RecentStats: unable to find data in memory cache]" Feb 17 17:06:15 crc kubenswrapper[4694]: I0217 17:06:15.507477 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:06:15 crc kubenswrapper[4694]: I0217 17:06:15.507928 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 17:06:15 crc kubenswrapper[4694]: I0217 17:06:15.541425 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n445k"] Feb 17 17:06:15 crc kubenswrapper[4694]: I0217 17:06:15.554229 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 17:06:15 crc kubenswrapper[4694]: I0217 17:06:15.594001 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 17:06:16 crc kubenswrapper[4694]: I0217 17:06:16.144534 4694 generic.go:334] "Generic (PLEG): container finished" podID="66856e01-6a23-43ab-9abb-eac237a7a192" containerID="faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522" exitCode=0 
Feb 17 17:06:16 crc kubenswrapper[4694]: I0217 17:06:16.144628 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerDied","Data":"faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522"} Feb 17 17:06:16 crc kubenswrapper[4694]: I0217 17:06:16.144921 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerStarted","Data":"eb6205315e1af5a1650770f84a098b1d2d5472b24ae125a7fecc0fa67c54b922"} Feb 17 17:06:16 crc kubenswrapper[4694]: I0217 17:06:16.178317 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 17:06:16 crc kubenswrapper[4694]: I0217 17:06:16.519735 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8083aa6-ef30-42ca-b979-e21a0697ce79" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:06:16 crc kubenswrapper[4694]: I0217 17:06:16.519876 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8083aa6-ef30-42ca-b979-e21a0697ce79" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 17:06:17 crc kubenswrapper[4694]: I0217 17:06:17.155103 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerStarted","Data":"b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301"} Feb 17 17:06:18 crc kubenswrapper[4694]: I0217 17:06:18.172778 4694 generic.go:334] "Generic (PLEG): container finished" 
podID="66856e01-6a23-43ab-9abb-eac237a7a192" containerID="b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301" exitCode=0 Feb 17 17:06:18 crc kubenswrapper[4694]: I0217 17:06:18.172826 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerDied","Data":"b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301"} Feb 17 17:06:19 crc kubenswrapper[4694]: I0217 17:06:19.185155 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerStarted","Data":"44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb"} Feb 17 17:06:19 crc kubenswrapper[4694]: I0217 17:06:19.207000 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n445k" podStartSLOduration=2.7624996509999997 podStartE2EDuration="5.206979201s" podCreationTimestamp="2026-02-17 17:06:14 +0000 UTC" firstStartedPulling="2026-02-17 17:06:16.146326023 +0000 UTC m=+1443.903401347" lastFinishedPulling="2026-02-17 17:06:18.590805533 +0000 UTC m=+1446.347880897" observedRunningTime="2026-02-17 17:06:19.20209374 +0000 UTC m=+1446.959169084" watchObservedRunningTime="2026-02-17 17:06:19.206979201 +0000 UTC m=+1446.964054525" Feb 17 17:06:20 crc kubenswrapper[4694]: I0217 17:06:20.370368 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 17:06:23 crc kubenswrapper[4694]: I0217 17:06:23.420191 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 17:06:23 crc kubenswrapper[4694]: I0217 17:06:23.420771 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 17:06:23 crc kubenswrapper[4694]: I0217 17:06:23.431090 4694 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 17:06:23 crc kubenswrapper[4694]: I0217 17:06:23.431894 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 17:06:24 crc kubenswrapper[4694]: I0217 17:06:24.957857 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:24 crc kubenswrapper[4694]: I0217 17:06:24.958898 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.023328 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.286200 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.335449 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n445k"] Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.515044 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.515339 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.521826 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 17:06:25 crc kubenswrapper[4694]: I0217 17:06:25.525103 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 17:06:25 crc kubenswrapper[4694]: E0217 17:06:25.665647 4694 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2\": RecentStats: unable to find data in memory cache]" Feb 17 17:06:26 crc kubenswrapper[4694]: I0217 17:06:26.251084 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 17:06:26 crc kubenswrapper[4694]: I0217 17:06:26.256918 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.257729 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n445k" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="registry-server" containerID="cri-o://44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb" gracePeriod=2 Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.738060 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.856556 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkwzs\" (UniqueName: \"kubernetes.io/projected/66856e01-6a23-43ab-9abb-eac237a7a192-kube-api-access-mkwzs\") pod \"66856e01-6a23-43ab-9abb-eac237a7a192\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.856984 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-catalog-content\") pod \"66856e01-6a23-43ab-9abb-eac237a7a192\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.857169 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-utilities\") pod \"66856e01-6a23-43ab-9abb-eac237a7a192\" (UID: \"66856e01-6a23-43ab-9abb-eac237a7a192\") " Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.858449 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-utilities" (OuterVolumeSpecName: "utilities") pod "66856e01-6a23-43ab-9abb-eac237a7a192" (UID: "66856e01-6a23-43ab-9abb-eac237a7a192"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.874161 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66856e01-6a23-43ab-9abb-eac237a7a192-kube-api-access-mkwzs" (OuterVolumeSpecName: "kube-api-access-mkwzs") pod "66856e01-6a23-43ab-9abb-eac237a7a192" (UID: "66856e01-6a23-43ab-9abb-eac237a7a192"). InnerVolumeSpecName "kube-api-access-mkwzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.930259 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66856e01-6a23-43ab-9abb-eac237a7a192" (UID: "66856e01-6a23-43ab-9abb-eac237a7a192"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.958925 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkwzs\" (UniqueName: \"kubernetes.io/projected/66856e01-6a23-43ab-9abb-eac237a7a192-kube-api-access-mkwzs\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.958958 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:27 crc kubenswrapper[4694]: I0217 17:06:27.958968 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66856e01-6a23-43ab-9abb-eac237a7a192-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.267910 4694 generic.go:334] "Generic (PLEG): container finished" podID="66856e01-6a23-43ab-9abb-eac237a7a192" containerID="44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb" exitCode=0 Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.267982 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerDied","Data":"44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb"} Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.268048 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-n445k" event={"ID":"66856e01-6a23-43ab-9abb-eac237a7a192","Type":"ContainerDied","Data":"eb6205315e1af5a1650770f84a098b1d2d5472b24ae125a7fecc0fa67c54b922"} Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.268067 4694 scope.go:117] "RemoveContainer" containerID="44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.268764 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n445k" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.286781 4694 scope.go:117] "RemoveContainer" containerID="b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.308038 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n445k"] Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.314789 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n445k"] Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.321028 4694 scope.go:117] "RemoveContainer" containerID="faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.361146 4694 scope.go:117] "RemoveContainer" containerID="44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb" Feb 17 17:06:28 crc kubenswrapper[4694]: E0217 17:06:28.361891 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb\": container with ID starting with 44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb not found: ID does not exist" containerID="44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 
17:06:28.361942 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb"} err="failed to get container status \"44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb\": rpc error: code = NotFound desc = could not find container \"44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb\": container with ID starting with 44ffdb8684d9d222d061272b5db4cd3d15ebcdfcf6f3493bd1b76d86bdc11fbb not found: ID does not exist" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.361976 4694 scope.go:117] "RemoveContainer" containerID="b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301" Feb 17 17:06:28 crc kubenswrapper[4694]: E0217 17:06:28.362387 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301\": container with ID starting with b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301 not found: ID does not exist" containerID="b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.362482 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301"} err="failed to get container status \"b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301\": rpc error: code = NotFound desc = could not find container \"b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301\": container with ID starting with b8ac6bb702ee2e76244c353aaf1779e6d711bcc5f4eba66078cf9f386d689301 not found: ID does not exist" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.362557 4694 scope.go:117] "RemoveContainer" containerID="faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522" Feb 17 17:06:28 crc 
kubenswrapper[4694]: E0217 17:06:28.363282 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522\": container with ID starting with faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522 not found: ID does not exist" containerID="faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.363312 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522"} err="failed to get container status \"faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522\": rpc error: code = NotFound desc = could not find container \"faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522\": container with ID starting with faca32b06e2e51d773ea6b7d6b341fc65a5c200af8554598dac7f238f8de6522 not found: ID does not exist" Feb 17 17:06:28 crc kubenswrapper[4694]: I0217 17:06:28.910700 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" path="/var/lib/kubelet/pods/66856e01-6a23-43ab-9abb-eac237a7a192/volumes" Feb 17 17:06:33 crc kubenswrapper[4694]: I0217 17:06:33.870646 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:06:34 crc kubenswrapper[4694]: I0217 17:06:34.691415 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:06:35 crc kubenswrapper[4694]: E0217 17:06:35.895493 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice\": RecentStats: unable to find data in memory cache]" Feb 17 17:06:37 crc kubenswrapper[4694]: I0217 17:06:37.697169 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="647b8309-483b-4f58-8360-202bb4b14824" containerName="rabbitmq" containerID="cri-o://3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e" gracePeriod=604797 Feb 17 17:06:38 crc kubenswrapper[4694]: I0217 17:06:38.511805 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" containerName="rabbitmq" containerID="cri-o://f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9" gracePeriod=604797 Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.270041 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.418739 4694 generic.go:334] "Generic (PLEG): container finished" podID="647b8309-483b-4f58-8360-202bb4b14824" containerID="3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e" exitCode=0 Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.418793 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"647b8309-483b-4f58-8360-202bb4b14824","Type":"ContainerDied","Data":"3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e"} Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.418812 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.418829 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"647b8309-483b-4f58-8360-202bb4b14824","Type":"ContainerDied","Data":"0bf4341153fbff008ea6884a8b134690cb8e506e1d6358c7fcd1e75f26b63914"} Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.418854 4694 scope.go:117] "RemoveContainer" containerID="3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.426750 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-confd\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.426881 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/647b8309-483b-4f58-8360-202bb4b14824-erlang-cookie-secret\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.426931 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.426997 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-erlang-cookie\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 
17:06:44.427043 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-tls\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427071 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plc8h\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-kube-api-access-plc8h\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427117 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-plugins\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427203 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-server-conf\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427773 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427838 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/647b8309-483b-4f58-8360-202bb4b14824-pod-info\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427852 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427942 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-config-data\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.427988 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-plugins-conf\") pod \"647b8309-483b-4f58-8360-202bb4b14824\" (UID: \"647b8309-483b-4f58-8360-202bb4b14824\") " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.428525 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.429284 4694 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.429304 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.429317 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.438875 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/647b8309-483b-4f58-8360-202bb4b14824-pod-info" (OuterVolumeSpecName: "pod-info") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.438878 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-kube-api-access-plc8h" (OuterVolumeSpecName: "kube-api-access-plc8h") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "kube-api-access-plc8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.445969 4694 scope.go:117] "RemoveContainer" containerID="3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.463149 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.463460 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.463726 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-config-data" (OuterVolumeSpecName: "config-data") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.468271 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647b8309-483b-4f58-8360-202bb4b14824-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.519469 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-server-conf" (OuterVolumeSpecName: "server-conf") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532071 4694 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532104 4694 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/647b8309-483b-4f58-8360-202bb4b14824-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532115 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/647b8309-483b-4f58-8360-202bb4b14824-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532125 4694 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/647b8309-483b-4f58-8360-202bb4b14824-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532157 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532168 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.532179 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plc8h\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-kube-api-access-plc8h\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.557186 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.578251 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "647b8309-483b-4f58-8360-202bb4b14824" (UID: "647b8309-483b-4f58-8360-202bb4b14824"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.580083 4694 scope.go:117] "RemoveContainer" containerID="3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e" Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.580383 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e\": container with ID starting with 3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e not found: ID does not exist" containerID="3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.580484 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e"} err="failed to get container status \"3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e\": rpc error: code = NotFound desc = could not find container \"3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e\": container with ID starting with 3ff8c6d0ee2af9f473291d403240702e3ac332ca0eaec8a873e69df6ce8ee13e not found: ID does not exist" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.580556 4694 scope.go:117] "RemoveContainer" containerID="3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053" Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.581078 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053\": container with ID starting with 3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053 not found: ID does not exist" containerID="3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.581118 
4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053"} err="failed to get container status \"3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053\": rpc error: code = NotFound desc = could not find container \"3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053\": container with ID starting with 3578c3965539ff757b6765bb1cfaa6f76be92ad5e87f16274f372d23a73b5053 not found: ID does not exist" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.633713 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/647b8309-483b-4f58-8360-202bb4b14824-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.633822 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.765705 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.789945 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.818882 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.819296 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647b8309-483b-4f58-8360-202bb4b14824" containerName="setup-container" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819315 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="647b8309-483b-4f58-8360-202bb4b14824" containerName="setup-container" Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.819349 4694 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="647b8309-483b-4f58-8360-202bb4b14824" containerName="rabbitmq" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819355 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="647b8309-483b-4f58-8360-202bb4b14824" containerName="rabbitmq" Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.819391 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="registry-server" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819405 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="registry-server" Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.819422 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="extract-utilities" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819428 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="extract-utilities" Feb 17 17:06:44 crc kubenswrapper[4694]: E0217 17:06:44.819439 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="extract-content" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819444 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="extract-content" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819627 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="66856e01-6a23-43ab-9abb-eac237a7a192" containerName="registry-server" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.819649 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="647b8309-483b-4f58-8360-202bb4b14824" containerName="rabbitmq" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.820678 4694 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.822902 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.822930 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.823037 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.823054 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.823108 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6ddwk" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.823173 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.823290 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.835779 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.914516 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647b8309-483b-4f58-8360-202bb4b14824" path="/var/lib/kubelet/pods/647b8309-483b-4f58-8360-202bb4b14824/volumes" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938118 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " 
pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938166 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938196 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-config-data\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938358 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938411 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938441 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938565 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2b2g\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-kube-api-access-h2b2g\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938729 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.938779 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.939069 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9698ccc3-769b-43aa-a4bf-f7c95342555a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:44 crc kubenswrapper[4694]: I0217 17:06:44.939110 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9698ccc3-769b-43aa-a4bf-f7c95342555a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.040829 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041372 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041838 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9698ccc3-769b-43aa-a4bf-f7c95342555a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041864 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9698ccc3-769b-43aa-a4bf-f7c95342555a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041884 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041906 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041939 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-config-data\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041955 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041972 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041985 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.042040 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2b2g\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-kube-api-access-h2b2g\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " 
pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.041309 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.042847 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.043171 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-config-data\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.043175 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.043480 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.043726 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/9698ccc3-769b-43aa-a4bf-f7c95342555a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.047672 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.050258 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9698ccc3-769b-43aa-a4bf-f7c95342555a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.063387 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2b2g\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-kube-api-access-h2b2g\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.074432 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9698ccc3-769b-43aa-a4bf-f7c95342555a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.075601 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9698ccc3-769b-43aa-a4bf-f7c95342555a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " 
pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.101785 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9698ccc3-769b-43aa-a4bf-f7c95342555a\") " pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.144489 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.154084 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.246792 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-server-conf\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.246851 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.246877 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-confd\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.246954 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-tls\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.247020 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-config-data\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.247142 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30f5bee5-cb28-4508-b091-35e85e299afa-erlang-cookie-secret\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.247168 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-plugins\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.247183 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-erlang-cookie\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.247210 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-plugins-conf\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc 
kubenswrapper[4694]: I0217 17:06:45.247240 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30f5bee5-cb28-4508-b091-35e85e299afa-pod-info\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.247262 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgzrz\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-kube-api-access-rgzrz\") pod \"30f5bee5-cb28-4508-b091-35e85e299afa\" (UID: \"30f5bee5-cb28-4508-b091-35e85e299afa\") " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.250730 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.251662 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.252222 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.252416 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.252876 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-kube-api-access-rgzrz" (OuterVolumeSpecName: "kube-api-access-rgzrz") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "kube-api-access-rgzrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.257177 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f5bee5-cb28-4508-b091-35e85e299afa-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.257582 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.257601 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/30f5bee5-cb28-4508-b091-35e85e299afa-pod-info" (OuterVolumeSpecName: "pod-info") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.281468 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-config-data" (OuterVolumeSpecName: "config-data") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.332159 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-server-conf" (OuterVolumeSpecName: "server-conf") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349305 4694 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30f5bee5-cb28-4508-b091-35e85e299afa-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349344 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349359 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349375 4694 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349386 4694 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30f5bee5-cb28-4508-b091-35e85e299afa-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349399 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgzrz\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-kube-api-access-rgzrz\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349409 4694 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: 
I0217 17:06:45.349438 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349449 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.349460 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30f5bee5-cb28-4508-b091-35e85e299afa-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.368033 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.404458 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "30f5bee5-cb28-4508-b091-35e85e299afa" (UID: "30f5bee5-cb28-4508-b091-35e85e299afa"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.431504 4694 generic.go:334] "Generic (PLEG): container finished" podID="30f5bee5-cb28-4508-b091-35e85e299afa" containerID="f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9" exitCode=0 Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.431544 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30f5bee5-cb28-4508-b091-35e85e299afa","Type":"ContainerDied","Data":"f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9"} Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.431572 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30f5bee5-cb28-4508-b091-35e85e299afa","Type":"ContainerDied","Data":"6dc3707fedafb2c932e2dc60e99c8c2b30db0073dfe0206f4b2be22b7167ed7c"} Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.431588 4694 scope.go:117] "RemoveContainer" containerID="f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.431937 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.451856 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.451897 4694 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30f5bee5-cb28-4508-b091-35e85e299afa-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.477872 4694 scope.go:117] "RemoveContainer" containerID="f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.511200 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.519924 4694 scope.go:117] "RemoveContainer" containerID="f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9" Feb 17 17:06:45 crc kubenswrapper[4694]: E0217 17:06:45.521500 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9\": container with ID starting with f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9 not found: ID does not exist" containerID="f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.521552 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9"} err="failed to get container status \"f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9\": rpc error: code = NotFound desc = could not find container 
\"f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9\": container with ID starting with f83c686aaa6578ad7695b20f32119a80da0aef0ea2645aa2c0b1719e6300d5b9 not found: ID does not exist" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.521581 4694 scope.go:117] "RemoveContainer" containerID="f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.521712 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:06:45 crc kubenswrapper[4694]: E0217 17:06:45.522120 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e\": container with ID starting with f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e not found: ID does not exist" containerID="f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.522144 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e"} err="failed to get container status \"f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e\": rpc error: code = NotFound desc = could not find container \"f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e\": container with ID starting with f52a620174bc36e69c58d59f37f7dee9104d43a6e9ad071d06575742f37a905e not found: ID does not exist" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.530879 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:06:45 crc kubenswrapper[4694]: E0217 17:06:45.531285 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" containerName="setup-container" Feb 17 17:06:45 crc kubenswrapper[4694]: 
I0217 17:06:45.531302 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" containerName="setup-container" Feb 17 17:06:45 crc kubenswrapper[4694]: E0217 17:06:45.531314 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" containerName="rabbitmq" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.531322 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" containerName="rabbitmq" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.531543 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" containerName="rabbitmq" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.532586 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.535488 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.536674 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-k2bk5" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.536891 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.537048 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.537197 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.539104 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 
17:06:45.539451 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.548197 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673232 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673326 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673392 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673461 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/181e3039-f77f-47e6-acef-e1dcd93d30f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673598 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673668 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/181e3039-f77f-47e6-acef-e1dcd93d30f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673749 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673795 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcjjv\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-kube-api-access-gcjjv\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.673832 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.674008 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.674085 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.714917 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 17:06:45 crc kubenswrapper[4694]: W0217 17:06:45.716429 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9698ccc3_769b_43aa_a4bf_f7c95342555a.slice/crio-6ba0d0f244d7fadd81ea888f8c7a23465bf8bd7bf10c33b920eecde6740c3ab2 WatchSource:0}: Error finding container 6ba0d0f244d7fadd81ea888f8c7a23465bf8bd7bf10c33b920eecde6740c3ab2: Status 404 returned error can't find the container with id 6ba0d0f244d7fadd81ea888f8c7a23465bf8bd7bf10c33b920eecde6740c3ab2 Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775526 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/181e3039-f77f-47e6-acef-e1dcd93d30f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775622 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775645 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcjjv\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-kube-api-access-gcjjv\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775675 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775728 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775764 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775810 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775840 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775872 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775906 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/181e3039-f77f-47e6-acef-e1dcd93d30f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.775969 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.776881 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc 
kubenswrapper[4694]: I0217 17:06:45.778538 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.779731 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.779911 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/181e3039-f77f-47e6-acef-e1dcd93d30f8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.781632 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.782130 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.788806 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.793815 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.794124 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/181e3039-f77f-47e6-acef-e1dcd93d30f8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.794255 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/181e3039-f77f-47e6-acef-e1dcd93d30f8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.815685 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcjjv\" (UniqueName: \"kubernetes.io/projected/181e3039-f77f-47e6-acef-e1dcd93d30f8-kube-api-access-gcjjv\") pod \"rabbitmq-cell1-server-0\" (UID: \"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:45 crc kubenswrapper[4694]: I0217 17:06:45.853699 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"181e3039-f77f-47e6-acef-e1dcd93d30f8\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.156245 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:06:46 crc kubenswrapper[4694]: E0217 17:06:46.172144 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2\": RecentStats: unable to find data in memory cache]" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.441778 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9698ccc3-769b-43aa-a4bf-f7c95342555a","Type":"ContainerStarted","Data":"6ba0d0f244d7fadd81ea888f8c7a23465bf8bd7bf10c33b920eecde6740c3ab2"} Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.523826 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-t87dh"] Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.525728 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.527596 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.545130 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-t87dh"] Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594641 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-config\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594699 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594731 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594785 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " 
pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594811 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594834 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.594882 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xr5m\" (UniqueName: \"kubernetes.io/projected/7c355771-b993-48d3-9d23-dfc5016a71cd-kube-api-access-8xr5m\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.606423 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 17:06:46 crc kubenswrapper[4694]: W0217 17:06:46.653691 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181e3039_f77f_47e6_acef_e1dcd93d30f8.slice/crio-a05bedbf03ca5f6022e32e5baa3ab250b2b93dee3a09b003ca07aee71c922753 WatchSource:0}: Error finding container a05bedbf03ca5f6022e32e5baa3ab250b2b93dee3a09b003ca07aee71c922753: Status 404 returned error can't find the container with id a05bedbf03ca5f6022e32e5baa3ab250b2b93dee3a09b003ca07aee71c922753 Feb 17 17:06:46 crc 
kubenswrapper[4694]: I0217 17:06:46.698143 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.698220 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.698290 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.698326 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.698350 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.698399 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xr5m\" (UniqueName: \"kubernetes.io/projected/7c355771-b993-48d3-9d23-dfc5016a71cd-kube-api-access-8xr5m\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.698448 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-config\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.699106 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.699285 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-config\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.699304 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-svc\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.699455 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.699680 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.699849 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.718772 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xr5m\" (UniqueName: \"kubernetes.io/projected/7c355771-b993-48d3-9d23-dfc5016a71cd-kube-api-access-8xr5m\") pod \"dnsmasq-dns-5576978c7c-t87dh\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.847779 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:46 crc kubenswrapper[4694]: I0217 17:06:46.909627 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f5bee5-cb28-4508-b091-35e85e299afa" path="/var/lib/kubelet/pods/30f5bee5-cb28-4508-b091-35e85e299afa/volumes" Feb 17 17:06:47 crc kubenswrapper[4694]: I0217 17:06:47.368822 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-t87dh"] Feb 17 17:06:47 crc kubenswrapper[4694]: W0217 17:06:47.373964 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c355771_b993_48d3_9d23_dfc5016a71cd.slice/crio-722b4232dbad020e210eb9ff940ddedb5e2a6312f1f85eaf9a8012bfbe7a9e6a WatchSource:0}: Error finding container 722b4232dbad020e210eb9ff940ddedb5e2a6312f1f85eaf9a8012bfbe7a9e6a: Status 404 returned error can't find the container with id 722b4232dbad020e210eb9ff940ddedb5e2a6312f1f85eaf9a8012bfbe7a9e6a Feb 17 17:06:47 crc kubenswrapper[4694]: I0217 17:06:47.454509 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9698ccc3-769b-43aa-a4bf-f7c95342555a","Type":"ContainerStarted","Data":"387628d85fe52c1674a28775859ed3fa06d224183c164a7b7d85faf972ae3cc1"} Feb 17 17:06:47 crc kubenswrapper[4694]: I0217 17:06:47.455967 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" event={"ID":"7c355771-b993-48d3-9d23-dfc5016a71cd","Type":"ContainerStarted","Data":"722b4232dbad020e210eb9ff940ddedb5e2a6312f1f85eaf9a8012bfbe7a9e6a"} Feb 17 17:06:47 crc kubenswrapper[4694]: I0217 17:06:47.457591 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"181e3039-f77f-47e6-acef-e1dcd93d30f8","Type":"ContainerStarted","Data":"a05bedbf03ca5f6022e32e5baa3ab250b2b93dee3a09b003ca07aee71c922753"} Feb 17 17:06:48 crc kubenswrapper[4694]: 
I0217 17:06:48.467068 4694 generic.go:334] "Generic (PLEG): container finished" podID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerID="d3b79598743c3e47cc32b7dd206316a1c7536abb7c385b659915f61478d50e5a" exitCode=0 Feb 17 17:06:48 crc kubenswrapper[4694]: I0217 17:06:48.467255 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" event={"ID":"7c355771-b993-48d3-9d23-dfc5016a71cd","Type":"ContainerDied","Data":"d3b79598743c3e47cc32b7dd206316a1c7536abb7c385b659915f61478d50e5a"} Feb 17 17:06:49 crc kubenswrapper[4694]: I0217 17:06:49.477358 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"181e3039-f77f-47e6-acef-e1dcd93d30f8","Type":"ContainerStarted","Data":"73f3d3d83ff4e6acf4dd7b67dc466e62f372eaf400d04d1c6928ed49d8c4a9f7"} Feb 17 17:06:49 crc kubenswrapper[4694]: I0217 17:06:49.479934 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" event={"ID":"7c355771-b993-48d3-9d23-dfc5016a71cd","Type":"ContainerStarted","Data":"0dac361279b1e6d13e8f0aac63d7d843e2768b14eae14398211cd7ae2c4ab3fd"} Feb 17 17:06:49 crc kubenswrapper[4694]: I0217 17:06:49.526324 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" podStartSLOduration=3.526301954 podStartE2EDuration="3.526301954s" podCreationTimestamp="2026-02-17 17:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:06:49.522942561 +0000 UTC m=+1477.280017895" watchObservedRunningTime="2026-02-17 17:06:49.526301954 +0000 UTC m=+1477.283377278" Feb 17 17:06:50 crc kubenswrapper[4694]: I0217 17:06:50.492664 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:56 crc kubenswrapper[4694]: E0217 17:06:56.408114 4694 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2\": RecentStats: unable to find data in memory cache]" Feb 17 17:06:56 crc kubenswrapper[4694]: I0217 17:06:56.849539 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:06:56 crc kubenswrapper[4694]: I0217 17:06:56.944309 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8qv52"] Feb 17 17:06:56 crc kubenswrapper[4694]: I0217 17:06:56.945063 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerName="dnsmasq-dns" containerID="cri-o://dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d" gracePeriod=10 Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.043546 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-l2p9l"] Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.045514 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.084894 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-l2p9l"] Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120648 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120768 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120844 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-config\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120868 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120895 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qhhpr\" (UniqueName: \"kubernetes.io/projected/c65ebfa5-bbb3-4011-8593-8cfbd2765254-kube-api-access-qhhpr\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120921 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.120954 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223165 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-config\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223208 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223240 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qhhpr\" (UniqueName: \"kubernetes.io/projected/c65ebfa5-bbb3-4011-8593-8cfbd2765254-kube-api-access-qhhpr\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223272 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223313 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223390 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.223691 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.225188 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.226697 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-config\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.227340 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.228507 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.229328 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.229664 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65ebfa5-bbb3-4011-8593-8cfbd2765254-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" 
(UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.262473 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhpr\" (UniqueName: \"kubernetes.io/projected/c65ebfa5-bbb3-4011-8593-8cfbd2765254-kube-api-access-qhhpr\") pod \"dnsmasq-dns-8c6f6df99-l2p9l\" (UID: \"c65ebfa5-bbb3-4011-8593-8cfbd2765254\") " pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.415286 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.480492 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.583374 4694 generic.go:334] "Generic (PLEG): container finished" podID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerID="dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d" exitCode=0 Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.583700 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" event={"ID":"f079a4f8-0663-4400-a495-b684a3cf7ef9","Type":"ContainerDied","Data":"dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d"} Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.583727 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" event={"ID":"f079a4f8-0663-4400-a495-b684a3cf7ef9","Type":"ContainerDied","Data":"5c54de90259f959cf1bfa3dcc4926347ebdfa824a71e257645be9abba1925173"} Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.583813 4694 scope.go:117] "RemoveContainer" containerID="dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.583954 4694 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8qv52" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.610619 4694 scope.go:117] "RemoveContainer" containerID="ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.637061 4694 scope.go:117] "RemoveContainer" containerID="dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d" Feb 17 17:06:57 crc kubenswrapper[4694]: E0217 17:06:57.637807 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d\": container with ID starting with dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d not found: ID does not exist" containerID="dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.637848 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d"} err="failed to get container status \"dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d\": rpc error: code = NotFound desc = could not find container \"dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d\": container with ID starting with dcf731288ca6791f6ae49519d69a4cd4fbab3660c72e625a8b3c39e3f680453d not found: ID does not exist" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.637898 4694 scope.go:117] "RemoveContainer" containerID="ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb" Feb 17 17:06:57 crc kubenswrapper[4694]: E0217 17:06:57.638389 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb\": container with ID starting with 
ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb not found: ID does not exist" containerID="ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.638442 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb"} err="failed to get container status \"ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb\": rpc error: code = NotFound desc = could not find container \"ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb\": container with ID starting with ef34c7ced34aa97c8b415e71957bcd7c2ca8f2b036cc4569446fda7b7662feeb not found: ID does not exist" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.640235 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-sb\") pod \"f079a4f8-0663-4400-a495-b684a3cf7ef9\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.640337 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-swift-storage-0\") pod \"f079a4f8-0663-4400-a495-b684a3cf7ef9\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.640413 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrh6n\" (UniqueName: \"kubernetes.io/projected/f079a4f8-0663-4400-a495-b684a3cf7ef9-kube-api-access-zrh6n\") pod \"f079a4f8-0663-4400-a495-b684a3cf7ef9\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.640451 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-config\") pod \"f079a4f8-0663-4400-a495-b684a3cf7ef9\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.640469 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-nb\") pod \"f079a4f8-0663-4400-a495-b684a3cf7ef9\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.640520 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-svc\") pod \"f079a4f8-0663-4400-a495-b684a3cf7ef9\" (UID: \"f079a4f8-0663-4400-a495-b684a3cf7ef9\") " Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.646484 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f079a4f8-0663-4400-a495-b684a3cf7ef9-kube-api-access-zrh6n" (OuterVolumeSpecName: "kube-api-access-zrh6n") pod "f079a4f8-0663-4400-a495-b684a3cf7ef9" (UID: "f079a4f8-0663-4400-a495-b684a3cf7ef9"). InnerVolumeSpecName "kube-api-access-zrh6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.694993 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f079a4f8-0663-4400-a495-b684a3cf7ef9" (UID: "f079a4f8-0663-4400-a495-b684a3cf7ef9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.696031 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f079a4f8-0663-4400-a495-b684a3cf7ef9" (UID: "f079a4f8-0663-4400-a495-b684a3cf7ef9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.697485 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-config" (OuterVolumeSpecName: "config") pod "f079a4f8-0663-4400-a495-b684a3cf7ef9" (UID: "f079a4f8-0663-4400-a495-b684a3cf7ef9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.698926 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f079a4f8-0663-4400-a495-b684a3cf7ef9" (UID: "f079a4f8-0663-4400-a495-b684a3cf7ef9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.709271 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f079a4f8-0663-4400-a495-b684a3cf7ef9" (UID: "f079a4f8-0663-4400-a495-b684a3cf7ef9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.743451 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.743493 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.743504 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrh6n\" (UniqueName: \"kubernetes.io/projected/f079a4f8-0663-4400-a495-b684a3cf7ef9-kube-api-access-zrh6n\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.743516 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.743528 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.743537 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f079a4f8-0663-4400-a495-b684a3cf7ef9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.904466 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-l2p9l"] Feb 17 17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.919186 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8qv52"] Feb 17 
17:06:57 crc kubenswrapper[4694]: I0217 17:06:57.928512 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8qv52"] Feb 17 17:06:58 crc kubenswrapper[4694]: I0217 17:06:58.592268 4694 generic.go:334] "Generic (PLEG): container finished" podID="c65ebfa5-bbb3-4011-8593-8cfbd2765254" containerID="116a6f62ec2fa9b5e73b085c6cf2ed7e2362607c28daecaec37e60add458b451" exitCode=0 Feb 17 17:06:58 crc kubenswrapper[4694]: I0217 17:06:58.592316 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" event={"ID":"c65ebfa5-bbb3-4011-8593-8cfbd2765254","Type":"ContainerDied","Data":"116a6f62ec2fa9b5e73b085c6cf2ed7e2362607c28daecaec37e60add458b451"} Feb 17 17:06:58 crc kubenswrapper[4694]: I0217 17:06:58.592651 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" event={"ID":"c65ebfa5-bbb3-4011-8593-8cfbd2765254","Type":"ContainerStarted","Data":"b69c7b76add87439ab63d62a6dfc8c18ee9b4a4eb900655b0073cb223ae968d6"} Feb 17 17:06:58 crc kubenswrapper[4694]: I0217 17:06:58.905181 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" path="/var/lib/kubelet/pods/f079a4f8-0663-4400-a495-b684a3cf7ef9/volumes" Feb 17 17:06:59 crc kubenswrapper[4694]: I0217 17:06:59.603554 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" event={"ID":"c65ebfa5-bbb3-4011-8593-8cfbd2765254","Type":"ContainerStarted","Data":"d9d9bef6164ab494cc3c46c412091fede38b2fbaa184ed65689160c8dc000ea6"} Feb 17 17:06:59 crc kubenswrapper[4694]: I0217 17:06:59.604016 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:06:59 crc kubenswrapper[4694]: I0217 17:06:59.628264 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" podStartSLOduration=2.628248645 
podStartE2EDuration="2.628248645s" podCreationTimestamp="2026-02-17 17:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:06:59.627841125 +0000 UTC m=+1487.384916449" watchObservedRunningTime="2026-02-17 17:06:59.628248645 +0000 UTC m=+1487.385323969" Feb 17 17:07:06 crc kubenswrapper[4694]: E0217 17:07:06.661715 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24261cc7_a023_431d_bc69_8d8009b41a03.slice/crio-09f01ef73567b967c31fc2fae0dcf68a5dd9006f3850471afa71dc8e699690c2\": RecentStats: unable to find data in memory cache]" Feb 17 17:07:07 crc kubenswrapper[4694]: I0217 17:07:07.416777 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-l2p9l" Feb 17 17:07:07 crc kubenswrapper[4694]: I0217 17:07:07.507372 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-t87dh"] Feb 17 17:07:07 crc kubenswrapper[4694]: I0217 17:07:07.508039 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerName="dnsmasq-dns" containerID="cri-o://0dac361279b1e6d13e8f0aac63d7d843e2768b14eae14398211cd7ae2c4ab3fd" gracePeriod=10 Feb 17 17:07:07 crc kubenswrapper[4694]: I0217 17:07:07.681042 4694 generic.go:334] "Generic (PLEG): container finished" podID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerID="0dac361279b1e6d13e8f0aac63d7d843e2768b14eae14398211cd7ae2c4ab3fd" exitCode=0 Feb 17 17:07:07 crc kubenswrapper[4694]: I0217 17:07:07.681157 4694 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" event={"ID":"7c355771-b993-48d3-9d23-dfc5016a71cd","Type":"ContainerDied","Data":"0dac361279b1e6d13e8f0aac63d7d843e2768b14eae14398211cd7ae2c4ab3fd"} Feb 17 17:07:07 crc kubenswrapper[4694]: I0217 17:07:07.966121 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.042473 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-config\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.042600 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-svc\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.042680 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xr5m\" (UniqueName: \"kubernetes.io/projected/7c355771-b993-48d3-9d23-dfc5016a71cd-kube-api-access-8xr5m\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.042720 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-swift-storage-0\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.042906 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-openstack-edpm-ipam\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.043053 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-nb\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.043075 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-sb\") pod \"7c355771-b993-48d3-9d23-dfc5016a71cd\" (UID: \"7c355771-b993-48d3-9d23-dfc5016a71cd\") " Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.067039 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c355771-b993-48d3-9d23-dfc5016a71cd-kube-api-access-8xr5m" (OuterVolumeSpecName: "kube-api-access-8xr5m") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "kube-api-access-8xr5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.093557 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.096201 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.096960 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.097439 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-config" (OuterVolumeSpecName: "config") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.099625 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.125100 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c355771-b993-48d3-9d23-dfc5016a71cd" (UID: "7c355771-b993-48d3-9d23-dfc5016a71cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145832 4694 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145869 4694 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145879 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xr5m\" (UniqueName: \"kubernetes.io/projected/7c355771-b993-48d3-9d23-dfc5016a71cd-kube-api-access-8xr5m\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145889 4694 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145898 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145906 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.145917 4694 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c355771-b993-48d3-9d23-dfc5016a71cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.689999 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" event={"ID":"7c355771-b993-48d3-9d23-dfc5016a71cd","Type":"ContainerDied","Data":"722b4232dbad020e210eb9ff940ddedb5e2a6312f1f85eaf9a8012bfbe7a9e6a"} Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.690090 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-t87dh" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.690311 4694 scope.go:117] "RemoveContainer" containerID="0dac361279b1e6d13e8f0aac63d7d843e2768b14eae14398211cd7ae2c4ab3fd" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.707906 4694 scope.go:117] "RemoveContainer" containerID="d3b79598743c3e47cc32b7dd206316a1c7536abb7c385b659915f61478d50e5a" Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.729412 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-t87dh"] Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.742928 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-t87dh"] Feb 17 17:07:08 crc kubenswrapper[4694]: I0217 17:07:08.905427 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" path="/var/lib/kubelet/pods/7c355771-b993-48d3-9d23-dfc5016a71cd/volumes" Feb 17 17:07:14 crc kubenswrapper[4694]: I0217 17:07:14.617810 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:07:14 crc kubenswrapper[4694]: I0217 17:07:14.618431 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:07:19 crc kubenswrapper[4694]: I0217 17:07:19.796152 4694 generic.go:334] "Generic (PLEG): container finished" podID="9698ccc3-769b-43aa-a4bf-f7c95342555a" containerID="387628d85fe52c1674a28775859ed3fa06d224183c164a7b7d85faf972ae3cc1" exitCode=0 Feb 17 17:07:19 crc kubenswrapper[4694]: I0217 17:07:19.796371 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9698ccc3-769b-43aa-a4bf-f7c95342555a","Type":"ContainerDied","Data":"387628d85fe52c1674a28775859ed3fa06d224183c164a7b7d85faf972ae3cc1"} Feb 17 17:07:20 crc kubenswrapper[4694]: I0217 17:07:20.805809 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9698ccc3-769b-43aa-a4bf-f7c95342555a","Type":"ContainerStarted","Data":"975f134768f95cb63579d05b0a9b5c4396ce38e70c4ae0425cbe47c50a0789b0"} Feb 17 17:07:20 crc kubenswrapper[4694]: I0217 17:07:20.807537 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 17:07:21 crc kubenswrapper[4694]: I0217 17:07:21.814347 4694 generic.go:334] "Generic (PLEG): container finished" podID="181e3039-f77f-47e6-acef-e1dcd93d30f8" containerID="73f3d3d83ff4e6acf4dd7b67dc466e62f372eaf400d04d1c6928ed49d8c4a9f7" exitCode=0 Feb 17 17:07:21 crc kubenswrapper[4694]: I0217 17:07:21.814447 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"181e3039-f77f-47e6-acef-e1dcd93d30f8","Type":"ContainerDied","Data":"73f3d3d83ff4e6acf4dd7b67dc466e62f372eaf400d04d1c6928ed49d8c4a9f7"} Feb 17 17:07:21 crc kubenswrapper[4694]: I0217 17:07:21.848827 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.848806435 podStartE2EDuration="37.848806435s" podCreationTimestamp="2026-02-17 17:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:07:20.835876382 +0000 UTC m=+1508.592951706" watchObservedRunningTime="2026-02-17 17:07:21.848806435 +0000 UTC m=+1509.605881759" Feb 17 17:07:22 crc kubenswrapper[4694]: I0217 17:07:22.825023 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"181e3039-f77f-47e6-acef-e1dcd93d30f8","Type":"ContainerStarted","Data":"eaab64c05bcdb1e24f4f1ccdf1e5b7822445d94bf0d4d8748753f896c111873f"} Feb 17 17:07:22 crc kubenswrapper[4694]: I0217 17:07:22.825327 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:07:22 crc kubenswrapper[4694]: I0217 17:07:22.857550 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.857526744 podStartE2EDuration="37.857526744s" podCreationTimestamp="2026-02-17 17:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:07:22.84600221 +0000 UTC m=+1510.603077534" watchObservedRunningTime="2026-02-17 17:07:22.857526744 +0000 UTC m=+1510.614602068" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.373245 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj"] Feb 17 17:07:24 
crc kubenswrapper[4694]: E0217 17:07:24.373928 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerName="init" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.373949 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerName="init" Feb 17 17:07:24 crc kubenswrapper[4694]: E0217 17:07:24.373975 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerName="dnsmasq-dns" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.373983 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerName="dnsmasq-dns" Feb 17 17:07:24 crc kubenswrapper[4694]: E0217 17:07:24.374005 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerName="dnsmasq-dns" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.374013 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerName="dnsmasq-dns" Feb 17 17:07:24 crc kubenswrapper[4694]: E0217 17:07:24.374041 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerName="init" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.374054 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerName="init" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.374322 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f079a4f8-0663-4400-a495-b684a3cf7ef9" containerName="dnsmasq-dns" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.374346 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c355771-b993-48d3-9d23-dfc5016a71cd" containerName="dnsmasq-dns" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.375414 4694 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.378404 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.378716 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.378963 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.381013 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.385949 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj"] Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.481730 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.482436 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 
17:07:24.482559 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm8zw\" (UniqueName: \"kubernetes.io/projected/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-kube-api-access-zm8zw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.482660 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.585343 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.585485 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.585601 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm8zw\" (UniqueName: 
\"kubernetes.io/projected/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-kube-api-access-zm8zw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.586944 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.596089 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.596216 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.596292 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.614524 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm8zw\" (UniqueName: \"kubernetes.io/projected/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-kube-api-access-zm8zw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:24 crc kubenswrapper[4694]: I0217 17:07:24.695480 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:25 crc kubenswrapper[4694]: W0217 17:07:25.371139 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef784368_4cf0_42fb_b4c5_b5ca19fe472a.slice/crio-2b3d93d60d5f4cad962ca5e7f5380fdcfa170f19766e2680804eeb7975707b0f WatchSource:0}: Error finding container 2b3d93d60d5f4cad962ca5e7f5380fdcfa170f19766e2680804eeb7975707b0f: Status 404 returned error can't find the container with id 2b3d93d60d5f4cad962ca5e7f5380fdcfa170f19766e2680804eeb7975707b0f Feb 17 17:07:25 crc kubenswrapper[4694]: I0217 17:07:25.379868 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj"] Feb 17 17:07:25 crc kubenswrapper[4694]: I0217 17:07:25.852150 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" event={"ID":"ef784368-4cf0-42fb-b4c5-b5ca19fe472a","Type":"ContainerStarted","Data":"2b3d93d60d5f4cad962ca5e7f5380fdcfa170f19766e2680804eeb7975707b0f"} Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.253645 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-86xxg"] Feb 17 17:07:34 crc kubenswrapper[4694]: 
I0217 17:07:34.256426 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.286837 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86xxg"] Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.295858 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmslz\" (UniqueName: \"kubernetes.io/projected/09a6ee22-e159-46fb-a182-5490b7f73c99-kube-api-access-lmslz\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.295914 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-catalog-content\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.295938 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-utilities\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.398009 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmslz\" (UniqueName: \"kubernetes.io/projected/09a6ee22-e159-46fb-a182-5490b7f73c99-kube-api-access-lmslz\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 
crc kubenswrapper[4694]: I0217 17:07:34.398081 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-catalog-content\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.398118 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-utilities\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.398583 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-utilities\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.399201 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-catalog-content\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.423140 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmslz\" (UniqueName: \"kubernetes.io/projected/09a6ee22-e159-46fb-a182-5490b7f73c99-kube-api-access-lmslz\") pod \"certified-operators-86xxg\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:34 crc kubenswrapper[4694]: I0217 17:07:34.585579 
4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:35 crc kubenswrapper[4694]: I0217 17:07:35.156862 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 17:07:36 crc kubenswrapper[4694]: I0217 17:07:36.159912 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 17:07:39 crc kubenswrapper[4694]: W0217 17:07:39.530211 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a6ee22_e159_46fb_a182_5490b7f73c99.slice/crio-55628595f66b142510877782e478745a7f4aff2e827db8f8ddf0072f19154a92 WatchSource:0}: Error finding container 55628595f66b142510877782e478745a7f4aff2e827db8f8ddf0072f19154a92: Status 404 returned error can't find the container with id 55628595f66b142510877782e478745a7f4aff2e827db8f8ddf0072f19154a92 Feb 17 17:07:39 crc kubenswrapper[4694]: I0217 17:07:39.530984 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86xxg"] Feb 17 17:07:40 crc kubenswrapper[4694]: I0217 17:07:40.037565 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" event={"ID":"ef784368-4cf0-42fb-b4c5-b5ca19fe472a","Type":"ContainerStarted","Data":"c0ceeb74cd9c8653c59b671b6ebee213bb1b85a0dd507332d819772f69a8b37b"} Feb 17 17:07:40 crc kubenswrapper[4694]: I0217 17:07:40.040865 4694 generic.go:334] "Generic (PLEG): container finished" podID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerID="2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6" exitCode=0 Feb 17 17:07:40 crc kubenswrapper[4694]: I0217 17:07:40.040911 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86xxg" 
event={"ID":"09a6ee22-e159-46fb-a182-5490b7f73c99","Type":"ContainerDied","Data":"2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6"} Feb 17 17:07:40 crc kubenswrapper[4694]: I0217 17:07:40.040933 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86xxg" event={"ID":"09a6ee22-e159-46fb-a182-5490b7f73c99","Type":"ContainerStarted","Data":"55628595f66b142510877782e478745a7f4aff2e827db8f8ddf0072f19154a92"} Feb 17 17:07:40 crc kubenswrapper[4694]: I0217 17:07:40.062157 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" podStartSLOduration=2.2574793 podStartE2EDuration="16.062133053s" podCreationTimestamp="2026-02-17 17:07:24 +0000 UTC" firstStartedPulling="2026-02-17 17:07:25.374080291 +0000 UTC m=+1513.131155615" lastFinishedPulling="2026-02-17 17:07:39.178734054 +0000 UTC m=+1526.935809368" observedRunningTime="2026-02-17 17:07:40.059104648 +0000 UTC m=+1527.816179982" watchObservedRunningTime="2026-02-17 17:07:40.062133053 +0000 UTC m=+1527.819208387" Feb 17 17:07:42 crc kubenswrapper[4694]: I0217 17:07:42.063256 4694 generic.go:334] "Generic (PLEG): container finished" podID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerID="31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f" exitCode=0 Feb 17 17:07:42 crc kubenswrapper[4694]: I0217 17:07:42.063309 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86xxg" event={"ID":"09a6ee22-e159-46fb-a182-5490b7f73c99","Type":"ContainerDied","Data":"31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f"} Feb 17 17:07:43 crc kubenswrapper[4694]: I0217 17:07:43.078714 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86xxg" 
event={"ID":"09a6ee22-e159-46fb-a182-5490b7f73c99","Type":"ContainerStarted","Data":"b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a"} Feb 17 17:07:43 crc kubenswrapper[4694]: I0217 17:07:43.103721 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-86xxg" podStartSLOduration=6.6615844079999995 podStartE2EDuration="9.103704529s" podCreationTimestamp="2026-02-17 17:07:34 +0000 UTC" firstStartedPulling="2026-02-17 17:07:40.042429387 +0000 UTC m=+1527.799504711" lastFinishedPulling="2026-02-17 17:07:42.484549508 +0000 UTC m=+1530.241624832" observedRunningTime="2026-02-17 17:07:43.09606541 +0000 UTC m=+1530.853140734" watchObservedRunningTime="2026-02-17 17:07:43.103704529 +0000 UTC m=+1530.860779853" Feb 17 17:07:44 crc kubenswrapper[4694]: I0217 17:07:44.586666 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:44 crc kubenswrapper[4694]: I0217 17:07:44.586992 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:44 crc kubenswrapper[4694]: I0217 17:07:44.617899 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:07:44 crc kubenswrapper[4694]: I0217 17:07:44.617966 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:07:44 crc kubenswrapper[4694]: I0217 17:07:44.630918 4694 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:54 crc kubenswrapper[4694]: I0217 17:07:54.639864 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:54 crc kubenswrapper[4694]: I0217 17:07:54.692060 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86xxg"] Feb 17 17:07:55 crc kubenswrapper[4694]: I0217 17:07:55.185020 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-86xxg" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="registry-server" containerID="cri-o://b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a" gracePeriod=2 Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.167864 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.208446 4694 generic.go:334] "Generic (PLEG): container finished" podID="ef784368-4cf0-42fb-b4c5-b5ca19fe472a" containerID="c0ceeb74cd9c8653c59b671b6ebee213bb1b85a0dd507332d819772f69a8b37b" exitCode=0 Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.208575 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" event={"ID":"ef784368-4cf0-42fb-b4c5-b5ca19fe472a","Type":"ContainerDied","Data":"c0ceeb74cd9c8653c59b671b6ebee213bb1b85a0dd507332d819772f69a8b37b"} Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.213931 4694 generic.go:334] "Generic (PLEG): container finished" podID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerID="b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a" exitCode=0 Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.213985 4694 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-86xxg" event={"ID":"09a6ee22-e159-46fb-a182-5490b7f73c99","Type":"ContainerDied","Data":"b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a"} Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.214017 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86xxg" event={"ID":"09a6ee22-e159-46fb-a182-5490b7f73c99","Type":"ContainerDied","Data":"55628595f66b142510877782e478745a7f4aff2e827db8f8ddf0072f19154a92"} Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.214037 4694 scope.go:117] "RemoveContainer" containerID="b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.214203 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86xxg" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.238877 4694 scope.go:117] "RemoveContainer" containerID="31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.258077 4694 scope.go:117] "RemoveContainer" containerID="2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.300545 4694 scope.go:117] "RemoveContainer" containerID="b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a" Feb 17 17:07:56 crc kubenswrapper[4694]: E0217 17:07:56.301094 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a\": container with ID starting with b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a not found: ID does not exist" containerID="b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.301134 4694 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a"} err="failed to get container status \"b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a\": rpc error: code = NotFound desc = could not find container \"b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a\": container with ID starting with b755b5aca5eff11cf1528bfdccfe92d9678b93f962ea209181ffa98bdfa95a1a not found: ID does not exist" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.301160 4694 scope.go:117] "RemoveContainer" containerID="31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f" Feb 17 17:07:56 crc kubenswrapper[4694]: E0217 17:07:56.301493 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f\": container with ID starting with 31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f not found: ID does not exist" containerID="31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.301531 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f"} err="failed to get container status \"31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f\": rpc error: code = NotFound desc = could not find container \"31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f\": container with ID starting with 31c69c3c665a478841e127f82fa5221e9fe5ffe7327e7fabbcab1b78f40a708f not found: ID does not exist" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.301560 4694 scope.go:117] "RemoveContainer" containerID="2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6" Feb 17 17:07:56 crc kubenswrapper[4694]: E0217 
17:07:56.301887 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6\": container with ID starting with 2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6 not found: ID does not exist" containerID="2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.301917 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6"} err="failed to get container status \"2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6\": rpc error: code = NotFound desc = could not find container \"2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6\": container with ID starting with 2839863c38aa7be0d6a82a8fbeeb84e43b6d38f22672ccb1e90fffed725d79d6 not found: ID does not exist" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.313793 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-utilities\") pod \"09a6ee22-e159-46fb-a182-5490b7f73c99\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.313979 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-catalog-content\") pod \"09a6ee22-e159-46fb-a182-5490b7f73c99\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.314145 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmslz\" (UniqueName: \"kubernetes.io/projected/09a6ee22-e159-46fb-a182-5490b7f73c99-kube-api-access-lmslz\") pod 
\"09a6ee22-e159-46fb-a182-5490b7f73c99\" (UID: \"09a6ee22-e159-46fb-a182-5490b7f73c99\") " Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.314816 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-utilities" (OuterVolumeSpecName: "utilities") pod "09a6ee22-e159-46fb-a182-5490b7f73c99" (UID: "09a6ee22-e159-46fb-a182-5490b7f73c99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.319595 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a6ee22-e159-46fb-a182-5490b7f73c99-kube-api-access-lmslz" (OuterVolumeSpecName: "kube-api-access-lmslz") pod "09a6ee22-e159-46fb-a182-5490b7f73c99" (UID: "09a6ee22-e159-46fb-a182-5490b7f73c99"). InnerVolumeSpecName "kube-api-access-lmslz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.366426 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09a6ee22-e159-46fb-a182-5490b7f73c99" (UID: "09a6ee22-e159-46fb-a182-5490b7f73c99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.416096 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.416144 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmslz\" (UniqueName: \"kubernetes.io/projected/09a6ee22-e159-46fb-a182-5490b7f73c99-kube-api-access-lmslz\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.416156 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a6ee22-e159-46fb-a182-5490b7f73c99-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.550630 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86xxg"] Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.561022 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-86xxg"] Feb 17 17:07:56 crc kubenswrapper[4694]: I0217 17:07:56.906142 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" path="/var/lib/kubelet/pods/09a6ee22-e159-46fb-a182-5490b7f73c99/volumes" Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.697547 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.843952 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm8zw\" (UniqueName: \"kubernetes.io/projected/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-kube-api-access-zm8zw\") pod \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.844102 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-repo-setup-combined-ca-bundle\") pod \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.844192 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory\") pod \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.844477 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-ssh-key-openstack-edpm-ipam\") pod \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.849735 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-kube-api-access-zm8zw" (OuterVolumeSpecName: "kube-api-access-zm8zw") pod "ef784368-4cf0-42fb-b4c5-b5ca19fe472a" (UID: "ef784368-4cf0-42fb-b4c5-b5ca19fe472a"). InnerVolumeSpecName "kube-api-access-zm8zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.869707 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ef784368-4cf0-42fb-b4c5-b5ca19fe472a" (UID: "ef784368-4cf0-42fb-b4c5-b5ca19fe472a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:07:57 crc kubenswrapper[4694]: E0217 17:07:57.879308 4694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory podName:ef784368-4cf0-42fb-b4c5-b5ca19fe472a nodeName:}" failed. No retries permitted until 2026-02-17 17:07:58.379274667 +0000 UTC m=+1546.136350001 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory") pod "ef784368-4cf0-42fb-b4c5-b5ca19fe472a" (UID: "ef784368-4cf0-42fb-b4c5-b5ca19fe472a") : error deleting /var/lib/kubelet/pods/ef784368-4cf0-42fb-b4c5-b5ca19fe472a/volume-subpaths: remove /var/lib/kubelet/pods/ef784368-4cf0-42fb-b4c5-b5ca19fe472a/volume-subpaths: no such file or directory Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.881842 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ef784368-4cf0-42fb-b4c5-b5ca19fe472a" (UID: "ef784368-4cf0-42fb-b4c5-b5ca19fe472a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.946593 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.946649 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm8zw\" (UniqueName: \"kubernetes.io/projected/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-kube-api-access-zm8zw\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:57 crc kubenswrapper[4694]: I0217 17:07:57.946662 4694 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.235681 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" event={"ID":"ef784368-4cf0-42fb-b4c5-b5ca19fe472a","Type":"ContainerDied","Data":"2b3d93d60d5f4cad962ca5e7f5380fdcfa170f19766e2680804eeb7975707b0f"} Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.235734 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b3d93d60d5f4cad962ca5e7f5380fdcfa170f19766e2680804eeb7975707b0f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.235736 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.381065 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f"] Feb 17 17:07:58 crc kubenswrapper[4694]: E0217 17:07:58.428871 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="extract-utilities" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.428912 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="extract-utilities" Feb 17 17:07:58 crc kubenswrapper[4694]: E0217 17:07:58.428934 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef784368-4cf0-42fb-b4c5-b5ca19fe472a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.428942 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef784368-4cf0-42fb-b4c5-b5ca19fe472a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 17:07:58 crc kubenswrapper[4694]: E0217 17:07:58.428959 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="registry-server" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.428969 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="registry-server" Feb 17 17:07:58 crc kubenswrapper[4694]: E0217 17:07:58.429018 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="extract-content" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.429023 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="extract-content" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.429567 4694 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ef784368-4cf0-42fb-b4c5-b5ca19fe472a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.429618 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a6ee22-e159-46fb-a182-5490b7f73c99" containerName="registry-server" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.430583 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.451745 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f"] Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.453967 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory\") pod \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\" (UID: \"ef784368-4cf0-42fb-b4c5-b5ca19fe472a\") " Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.490886 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory" (OuterVolumeSpecName: "inventory") pod "ef784368-4cf0-42fb-b4c5-b5ca19fe472a" (UID: "ef784368-4cf0-42fb-b4c5-b5ca19fe472a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.556880 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.556997 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.557032 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtn8\" (UniqueName: \"kubernetes.io/projected/45da847c-7705-408d-a3c4-b05253e15d3f-kube-api-access-wmtn8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.557087 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef784368-4cf0-42fb-b4c5-b5ca19fe472a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.658540 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: 
\"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.658700 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.658744 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtn8\" (UniqueName: \"kubernetes.io/projected/45da847c-7705-408d-a3c4-b05253e15d3f-kube-api-access-wmtn8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.663329 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.664941 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.675790 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wmtn8\" (UniqueName: \"kubernetes.io/projected/45da847c-7705-408d-a3c4-b05253e15d3f-kube-api-access-wmtn8\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-d487f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:58 crc kubenswrapper[4694]: I0217 17:07:58.823071 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:07:59 crc kubenswrapper[4694]: I0217 17:07:59.357363 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f"] Feb 17 17:08:00 crc kubenswrapper[4694]: I0217 17:08:00.257394 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" event={"ID":"45da847c-7705-408d-a3c4-b05253e15d3f","Type":"ContainerStarted","Data":"297b1dea44eaae693cd051f7d5d534ec2258227037c1dc1ded85c57b63bc61eb"} Feb 17 17:08:00 crc kubenswrapper[4694]: I0217 17:08:00.257797 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" event={"ID":"45da847c-7705-408d-a3c4-b05253e15d3f","Type":"ContainerStarted","Data":"3c1ed8f19e2bc7cdbafbba426ead62140ae7607c84b43dc7482971ef62a5853e"} Feb 17 17:08:00 crc kubenswrapper[4694]: I0217 17:08:00.278057 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" podStartSLOduration=2.08304316 podStartE2EDuration="2.278031779s" podCreationTimestamp="2026-02-17 17:07:58 +0000 UTC" firstStartedPulling="2026-02-17 17:07:59.357281461 +0000 UTC m=+1547.114356785" lastFinishedPulling="2026-02-17 17:07:59.55227009 +0000 UTC m=+1547.309345404" observedRunningTime="2026-02-17 17:08:00.277025394 +0000 UTC m=+1548.034100718" watchObservedRunningTime="2026-02-17 17:08:00.278031779 +0000 UTC 
m=+1548.035107123" Feb 17 17:08:02 crc kubenswrapper[4694]: I0217 17:08:02.275404 4694 generic.go:334] "Generic (PLEG): container finished" podID="45da847c-7705-408d-a3c4-b05253e15d3f" containerID="297b1dea44eaae693cd051f7d5d534ec2258227037c1dc1ded85c57b63bc61eb" exitCode=0 Feb 17 17:08:02 crc kubenswrapper[4694]: I0217 17:08:02.275499 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" event={"ID":"45da847c-7705-408d-a3c4-b05253e15d3f","Type":"ContainerDied","Data":"297b1dea44eaae693cd051f7d5d534ec2258227037c1dc1ded85c57b63bc61eb"} Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.732285 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.790803 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-ssh-key-openstack-edpm-ipam\") pod \"45da847c-7705-408d-a3c4-b05253e15d3f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.790872 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-inventory\") pod \"45da847c-7705-408d-a3c4-b05253e15d3f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.790967 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtn8\" (UniqueName: \"kubernetes.io/projected/45da847c-7705-408d-a3c4-b05253e15d3f-kube-api-access-wmtn8\") pod \"45da847c-7705-408d-a3c4-b05253e15d3f\" (UID: \"45da847c-7705-408d-a3c4-b05253e15d3f\") " Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.797829 4694 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45da847c-7705-408d-a3c4-b05253e15d3f-kube-api-access-wmtn8" (OuterVolumeSpecName: "kube-api-access-wmtn8") pod "45da847c-7705-408d-a3c4-b05253e15d3f" (UID: "45da847c-7705-408d-a3c4-b05253e15d3f"). InnerVolumeSpecName "kube-api-access-wmtn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.821426 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-inventory" (OuterVolumeSpecName: "inventory") pod "45da847c-7705-408d-a3c4-b05253e15d3f" (UID: "45da847c-7705-408d-a3c4-b05253e15d3f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.830166 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "45da847c-7705-408d-a3c4-b05253e15d3f" (UID: "45da847c-7705-408d-a3c4-b05253e15d3f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.892720 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmtn8\" (UniqueName: \"kubernetes.io/projected/45da847c-7705-408d-a3c4-b05253e15d3f-kube-api-access-wmtn8\") on node \"crc\" DevicePath \"\"" Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.892749 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:08:03 crc kubenswrapper[4694]: I0217 17:08:03.892760 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45da847c-7705-408d-a3c4-b05253e15d3f-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.295124 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" event={"ID":"45da847c-7705-408d-a3c4-b05253e15d3f","Type":"ContainerDied","Data":"3c1ed8f19e2bc7cdbafbba426ead62140ae7607c84b43dc7482971ef62a5853e"} Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.295427 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c1ed8f19e2bc7cdbafbba426ead62140ae7607c84b43dc7482971ef62a5853e" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.295474 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-d487f" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.361623 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5"] Feb 17 17:08:04 crc kubenswrapper[4694]: E0217 17:08:04.362082 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45da847c-7705-408d-a3c4-b05253e15d3f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.362107 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="45da847c-7705-408d-a3c4-b05253e15d3f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.362353 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="45da847c-7705-408d-a3c4-b05253e15d3f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.363105 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.365502 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.365545 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.365830 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.368086 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.376198 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5"] Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.402258 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.402323 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: 
I0217 17:08:04.402454 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbx6c\" (UniqueName: \"kubernetes.io/projected/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-kube-api-access-tbx6c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.402484 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.504567 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbx6c\" (UniqueName: \"kubernetes.io/projected/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-kube-api-access-tbx6c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.504637 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.504741 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.504764 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.509164 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.514094 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.514325 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.521543 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbx6c\" (UniqueName: \"kubernetes.io/projected/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-kube-api-access-tbx6c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:04 crc kubenswrapper[4694]: I0217 17:08:04.679190 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:08:05 crc kubenswrapper[4694]: I0217 17:08:05.216470 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5"] Feb 17 17:08:05 crc kubenswrapper[4694]: I0217 17:08:05.304770 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" event={"ID":"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d","Type":"ContainerStarted","Data":"484a9051c5c831f53a01d048303156b97fb96bbb585eee8ee7e0953b80926d74"} Feb 17 17:08:06 crc kubenswrapper[4694]: I0217 17:08:06.336438 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" event={"ID":"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d","Type":"ContainerStarted","Data":"96b65642ada1f4c2617c65900dbc02e563897f984fe8671e368a48c02f8295f7"} Feb 17 17:08:06 crc kubenswrapper[4694]: I0217 17:08:06.371447 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" podStartSLOduration=2.195595708 podStartE2EDuration="2.371411665s" podCreationTimestamp="2026-02-17 17:08:04 +0000 UTC" firstStartedPulling="2026-02-17 17:08:05.219093944 +0000 UTC m=+1552.976169268" 
lastFinishedPulling="2026-02-17 17:08:05.394909861 +0000 UTC m=+1553.151985225" observedRunningTime="2026-02-17 17:08:06.362444724 +0000 UTC m=+1554.119520058" watchObservedRunningTime="2026-02-17 17:08:06.371411665 +0000 UTC m=+1554.128487029" Feb 17 17:08:14 crc kubenswrapper[4694]: I0217 17:08:14.618178 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:08:14 crc kubenswrapper[4694]: I0217 17:08:14.618667 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:08:14 crc kubenswrapper[4694]: I0217 17:08:14.618713 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:08:14 crc kubenswrapper[4694]: I0217 17:08:14.619480 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:08:14 crc kubenswrapper[4694]: I0217 17:08:14.619535 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" 
containerID="cri-o://34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" gracePeriod=600 Feb 17 17:08:14 crc kubenswrapper[4694]: E0217 17:08:14.748049 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:08:15 crc kubenswrapper[4694]: I0217 17:08:15.429195 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" exitCode=0 Feb 17 17:08:15 crc kubenswrapper[4694]: I0217 17:08:15.429261 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f"} Feb 17 17:08:15 crc kubenswrapper[4694]: I0217 17:08:15.429318 4694 scope.go:117] "RemoveContainer" containerID="4d6c34eac314e32bb2a700fd2365f6cc5994e5e6d675cca523ef76c638d044d6" Feb 17 17:08:15 crc kubenswrapper[4694]: I0217 17:08:15.429904 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:08:15 crc kubenswrapper[4694]: E0217 17:08:15.430167 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:08:24 crc kubenswrapper[4694]: I0217 17:08:24.473396 4694 scope.go:117] "RemoveContainer" containerID="08807e28790f483219c90218f1222e272851656dca7c0fc037261c614e606669" Feb 17 17:08:30 crc kubenswrapper[4694]: I0217 17:08:30.895899 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:08:30 crc kubenswrapper[4694]: E0217 17:08:30.896584 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:08:45 crc kubenswrapper[4694]: I0217 17:08:45.895494 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:08:45 crc kubenswrapper[4694]: E0217 17:08:45.896457 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:08:58 crc kubenswrapper[4694]: I0217 17:08:58.894923 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:08:58 crc kubenswrapper[4694]: E0217 17:08:58.895552 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:09:12 crc kubenswrapper[4694]: I0217 17:09:12.895337 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:09:12 crc kubenswrapper[4694]: E0217 17:09:12.896128 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:09:24 crc kubenswrapper[4694]: I0217 17:09:24.558245 4694 scope.go:117] "RemoveContainer" containerID="6ab0ff5e18e690db7ef239db82f91bc6913210fb326bcf714effdd8d616ad7b4" Feb 17 17:09:24 crc kubenswrapper[4694]: I0217 17:09:24.594988 4694 scope.go:117] "RemoveContainer" containerID="d4f43c456a260da5382abe58881b26520dcdec039fdd492e8b142758c6c5c7ab" Feb 17 17:09:24 crc kubenswrapper[4694]: I0217 17:09:24.666096 4694 scope.go:117] "RemoveContainer" containerID="79457e6ee360e960dec5f3b489c05458b2915f47c1f9389813490d2a5c2da2f3" Feb 17 17:09:26 crc kubenswrapper[4694]: I0217 17:09:26.896750 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:09:26 crc kubenswrapper[4694]: E0217 17:09:26.898287 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:09:38 crc kubenswrapper[4694]: I0217 17:09:38.896408 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:09:38 crc kubenswrapper[4694]: E0217 17:09:38.897276 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:09:52 crc kubenswrapper[4694]: I0217 17:09:52.904570 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:09:52 crc kubenswrapper[4694]: E0217 17:09:52.905414 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:10:06 crc kubenswrapper[4694]: I0217 17:10:06.896255 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:10:06 crc kubenswrapper[4694]: E0217 17:10:06.897005 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:10:19 crc kubenswrapper[4694]: I0217 17:10:19.895752 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:10:19 crc kubenswrapper[4694]: E0217 17:10:19.897027 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:10:30 crc kubenswrapper[4694]: I0217 17:10:30.896101 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:10:30 crc kubenswrapper[4694]: E0217 17:10:30.897046 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:10:44 crc kubenswrapper[4694]: I0217 17:10:44.895762 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:10:44 crc kubenswrapper[4694]: E0217 17:10:44.896929 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:10:50 crc kubenswrapper[4694]: I0217 17:10:50.185472 4694 generic.go:334] "Generic (PLEG): container finished" podID="c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" containerID="96b65642ada1f4c2617c65900dbc02e563897f984fe8671e368a48c02f8295f7" exitCode=0 Feb 17 17:10:50 crc kubenswrapper[4694]: I0217 17:10:50.185622 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" event={"ID":"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d","Type":"ContainerDied","Data":"96b65642ada1f4c2617c65900dbc02e563897f984fe8671e368a48c02f8295f7"} Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.629628 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.650888 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-inventory\") pod \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.651059 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbx6c\" (UniqueName: \"kubernetes.io/projected/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-kube-api-access-tbx6c\") pod \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.651127 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-bootstrap-combined-ca-bundle\") pod \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.651163 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-ssh-key-openstack-edpm-ipam\") pod \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\" (UID: \"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d\") " Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.656404 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-kube-api-access-tbx6c" (OuterVolumeSpecName: "kube-api-access-tbx6c") pod "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" (UID: "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d"). InnerVolumeSpecName "kube-api-access-tbx6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.658853 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" (UID: "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.684341 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" (UID: "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.691740 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-inventory" (OuterVolumeSpecName: "inventory") pod "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" (UID: "c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.753052 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.753092 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.753106 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbx6c\" (UniqueName: \"kubernetes.io/projected/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-kube-api-access-tbx6c\") on node \"crc\" DevicePath \"\"" Feb 17 17:10:51 crc kubenswrapper[4694]: I0217 17:10:51.753123 4694 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.208018 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" event={"ID":"c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d","Type":"ContainerDied","Data":"484a9051c5c831f53a01d048303156b97fb96bbb585eee8ee7e0953b80926d74"} Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.208068 4694 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484a9051c5c831f53a01d048303156b97fb96bbb585eee8ee7e0953b80926d74" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.208159 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.302184 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr"] Feb 17 17:10:52 crc kubenswrapper[4694]: E0217 17:10:52.302982 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.303008 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.303260 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.304316 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.306273 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.306278 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.306410 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.308104 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.313508 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr"] Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.471077 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.471186 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltl8q\" (UniqueName: \"kubernetes.io/projected/ef5a14b7-490f-48b6-a150-6437a2a18fda-kube-api-access-ltl8q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 
17:10:52.471326 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.573210 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.573325 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.573397 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltl8q\" (UniqueName: \"kubernetes.io/projected/ef5a14b7-490f-48b6-a150-6437a2a18fda-kube-api-access-ltl8q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.577839 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.591792 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.593428 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltl8q\" (UniqueName: \"kubernetes.io/projected/ef5a14b7-490f-48b6-a150-6437a2a18fda-kube-api-access-ltl8q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lbchr\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:52 crc kubenswrapper[4694]: I0217 17:10:52.619818 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:10:53 crc kubenswrapper[4694]: I0217 17:10:53.138307 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr"] Feb 17 17:10:53 crc kubenswrapper[4694]: I0217 17:10:53.145770 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:10:53 crc kubenswrapper[4694]: I0217 17:10:53.219398 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" event={"ID":"ef5a14b7-490f-48b6-a150-6437a2a18fda","Type":"ContainerStarted","Data":"52c2a2f88524706d8f43046ba636a88d444fc6a729bb6bed3cf8435c22620f17"} Feb 17 17:10:54 crc kubenswrapper[4694]: I0217 17:10:54.230701 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" event={"ID":"ef5a14b7-490f-48b6-a150-6437a2a18fda","Type":"ContainerStarted","Data":"25070e4c7ed7893030ba81bf3e0a638823f9ae8ebc087e1f3775c854484c7bf3"} Feb 17 17:10:54 crc kubenswrapper[4694]: I0217 17:10:54.252466 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" podStartSLOduration=1.960680226 podStartE2EDuration="2.252449651s" podCreationTimestamp="2026-02-17 17:10:52 +0000 UTC" firstStartedPulling="2026-02-17 17:10:53.145474047 +0000 UTC m=+1720.902549381" lastFinishedPulling="2026-02-17 17:10:53.437243482 +0000 UTC m=+1721.194318806" observedRunningTime="2026-02-17 17:10:54.248486052 +0000 UTC m=+1722.005561386" watchObservedRunningTime="2026-02-17 17:10:54.252449651 +0000 UTC m=+1722.009524985" Feb 17 17:10:58 crc kubenswrapper[4694]: I0217 17:10:58.895706 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:10:58 crc 
kubenswrapper[4694]: E0217 17:10:58.897765 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:11:13 crc kubenswrapper[4694]: I0217 17:11:13.895804 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:11:13 crc kubenswrapper[4694]: E0217 17:11:13.898330 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:11:24 crc kubenswrapper[4694]: I0217 17:11:24.781699 4694 scope.go:117] "RemoveContainer" containerID="0ebd00181ecb4155e0a3fb0cdbc3e85a6543e82a12259197509d3dd99a55cb61" Feb 17 17:11:25 crc kubenswrapper[4694]: I0217 17:11:25.895801 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:11:25 crc kubenswrapper[4694]: E0217 17:11:25.896390 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 
17 17:11:38 crc kubenswrapper[4694]: I0217 17:11:38.895828 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:11:38 crc kubenswrapper[4694]: E0217 17:11:38.896630 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:11:50 crc kubenswrapper[4694]: I0217 17:11:50.896031 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:11:50 crc kubenswrapper[4694]: E0217 17:11:50.896829 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:12:02 crc kubenswrapper[4694]: I0217 17:12:02.902401 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:12:02 crc kubenswrapper[4694]: E0217 17:12:02.903232 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:12:04 crc kubenswrapper[4694]: I0217 17:12:04.048364 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xtd7t"] Feb 17 17:12:04 crc kubenswrapper[4694]: I0217 17:12:04.057725 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xtd7t"] Feb 17 17:12:04 crc kubenswrapper[4694]: I0217 17:12:04.909178 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0023fe-7368-42ed-a175-487cd538b39e" path="/var/lib/kubelet/pods/1d0023fe-7368-42ed-a175-487cd538b39e/volumes" Feb 17 17:12:05 crc kubenswrapper[4694]: I0217 17:12:05.036661 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f1bb-account-create-update-v4tm7"] Feb 17 17:12:05 crc kubenswrapper[4694]: I0217 17:12:05.047936 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f1bb-account-create-update-v4tm7"] Feb 17 17:12:05 crc kubenswrapper[4694]: I0217 17:12:05.057935 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-g4wjz"] Feb 17 17:12:05 crc kubenswrapper[4694]: I0217 17:12:05.068309 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-g4wjz"] Feb 17 17:12:05 crc kubenswrapper[4694]: I0217 17:12:05.078343 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db13-account-create-update-qrdw6"] Feb 17 17:12:05 crc kubenswrapper[4694]: I0217 17:12:05.086938 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db13-account-create-update-qrdw6"] Feb 17 17:12:06 crc kubenswrapper[4694]: I0217 17:12:06.908582 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b858a6-af85-4112-a8bf-1ed44b0004e7" path="/var/lib/kubelet/pods/84b858a6-af85-4112-a8bf-1ed44b0004e7/volumes" Feb 17 17:12:06 crc kubenswrapper[4694]: I0217 17:12:06.909358 4694 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="874f9975-fa4a-41b7-9aee-2010fb88447f" path="/var/lib/kubelet/pods/874f9975-fa4a-41b7-9aee-2010fb88447f/volumes" Feb 17 17:12:06 crc kubenswrapper[4694]: I0217 17:12:06.909990 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b48ab6-ed97-4844-bcf5-126f60c9b9a3" path="/var/lib/kubelet/pods/b7b48ab6-ed97-4844-bcf5-126f60c9b9a3/volumes" Feb 17 17:12:08 crc kubenswrapper[4694]: I0217 17:12:08.038512 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7n5vf"] Feb 17 17:12:08 crc kubenswrapper[4694]: I0217 17:12:08.048261 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6087-account-create-update-v6hqf"] Feb 17 17:12:08 crc kubenswrapper[4694]: I0217 17:12:08.059990 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6087-account-create-update-v6hqf"] Feb 17 17:12:08 crc kubenswrapper[4694]: I0217 17:12:08.070710 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7n5vf"] Feb 17 17:12:08 crc kubenswrapper[4694]: I0217 17:12:08.912359 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a2e4cd-27d5-4b34-8b28-61db84dafc59" path="/var/lib/kubelet/pods/79a2e4cd-27d5-4b34-8b28-61db84dafc59/volumes" Feb 17 17:12:08 crc kubenswrapper[4694]: I0217 17:12:08.913969 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f94c8d-bdff-499c-8e4f-1c5a022f328f" path="/var/lib/kubelet/pods/e1f94c8d-bdff-499c-8e4f-1c5a022f328f/volumes" Feb 17 17:12:14 crc kubenswrapper[4694]: I0217 17:12:14.897283 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:12:14 crc kubenswrapper[4694]: E0217 17:12:14.898261 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:12:17 crc kubenswrapper[4694]: I0217 17:12:17.448866 4694 generic.go:334] "Generic (PLEG): container finished" podID="ef5a14b7-490f-48b6-a150-6437a2a18fda" containerID="25070e4c7ed7893030ba81bf3e0a638823f9ae8ebc087e1f3775c854484c7bf3" exitCode=0 Feb 17 17:12:17 crc kubenswrapper[4694]: I0217 17:12:17.448948 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" event={"ID":"ef5a14b7-490f-48b6-a150-6437a2a18fda","Type":"ContainerDied","Data":"25070e4c7ed7893030ba81bf3e0a638823f9ae8ebc087e1f3775c854484c7bf3"} Feb 17 17:12:18 crc kubenswrapper[4694]: I0217 17:12:18.853453 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.038857 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-inventory\") pod \"ef5a14b7-490f-48b6-a150-6437a2a18fda\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.038943 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltl8q\" (UniqueName: \"kubernetes.io/projected/ef5a14b7-490f-48b6-a150-6437a2a18fda-kube-api-access-ltl8q\") pod \"ef5a14b7-490f-48b6-a150-6437a2a18fda\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.039044 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-ssh-key-openstack-edpm-ipam\") pod \"ef5a14b7-490f-48b6-a150-6437a2a18fda\" (UID: \"ef5a14b7-490f-48b6-a150-6437a2a18fda\") " Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.044445 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5a14b7-490f-48b6-a150-6437a2a18fda-kube-api-access-ltl8q" (OuterVolumeSpecName: "kube-api-access-ltl8q") pod "ef5a14b7-490f-48b6-a150-6437a2a18fda" (UID: "ef5a14b7-490f-48b6-a150-6437a2a18fda"). InnerVolumeSpecName "kube-api-access-ltl8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.066096 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-inventory" (OuterVolumeSpecName: "inventory") pod "ef5a14b7-490f-48b6-a150-6437a2a18fda" (UID: "ef5a14b7-490f-48b6-a150-6437a2a18fda"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.067708 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ef5a14b7-490f-48b6-a150-6437a2a18fda" (UID: "ef5a14b7-490f-48b6-a150-6437a2a18fda"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.141548 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltl8q\" (UniqueName: \"kubernetes.io/projected/ef5a14b7-490f-48b6-a150-6437a2a18fda-kube-api-access-ltl8q\") on node \"crc\" DevicePath \"\"" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.141619 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.141640 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5a14b7-490f-48b6-a150-6437a2a18fda-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.469040 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" event={"ID":"ef5a14b7-490f-48b6-a150-6437a2a18fda","Type":"ContainerDied","Data":"52c2a2f88524706d8f43046ba636a88d444fc6a729bb6bed3cf8435c22620f17"} Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.469090 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c2a2f88524706d8f43046ba636a88d444fc6a729bb6bed3cf8435c22620f17" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.469093 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lbchr" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.563507 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4"] Feb 17 17:12:19 crc kubenswrapper[4694]: E0217 17:12:19.564224 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a14b7-490f-48b6-a150-6437a2a18fda" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.564255 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a14b7-490f-48b6-a150-6437a2a18fda" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.564451 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5a14b7-490f-48b6-a150-6437a2a18fda" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.565240 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.570356 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.570562 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.571302 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.571420 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.580422 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4"] Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.650508 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.650811 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc 
kubenswrapper[4694]: I0217 17:12:19.650887 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5xnm\" (UniqueName: \"kubernetes.io/projected/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-kube-api-access-j5xnm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.752021 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.752076 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5xnm\" (UniqueName: \"kubernetes.io/projected/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-kube-api-access-j5xnm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.752113 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.756116 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.756132 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.783536 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5xnm\" (UniqueName: \"kubernetes.io/projected/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-kube-api-access-j5xnm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:19 crc kubenswrapper[4694]: I0217 17:12:19.884443 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:12:20 crc kubenswrapper[4694]: I0217 17:12:20.408400 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4"] Feb 17 17:12:20 crc kubenswrapper[4694]: W0217 17:12:20.410986 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1e17e7_cd69_4f0f_8e3f_e36399e001a8.slice/crio-9434b7bfc45f093429e4e53d0f0df784469a0b75d57ca0e03f1f754d05de48d9 WatchSource:0}: Error finding container 9434b7bfc45f093429e4e53d0f0df784469a0b75d57ca0e03f1f754d05de48d9: Status 404 returned error can't find the container with id 9434b7bfc45f093429e4e53d0f0df784469a0b75d57ca0e03f1f754d05de48d9 Feb 17 17:12:20 crc kubenswrapper[4694]: I0217 17:12:20.478410 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" event={"ID":"af1e17e7-cd69-4f0f-8e3f-e36399e001a8","Type":"ContainerStarted","Data":"9434b7bfc45f093429e4e53d0f0df784469a0b75d57ca0e03f1f754d05de48d9"} Feb 17 17:12:21 crc kubenswrapper[4694]: I0217 17:12:21.487305 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" event={"ID":"af1e17e7-cd69-4f0f-8e3f-e36399e001a8","Type":"ContainerStarted","Data":"493a6d05a0126899c2eb5d376471f5707ed59c156ba2d077c86be07082964787"} Feb 17 17:12:21 crc kubenswrapper[4694]: I0217 17:12:21.505653 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" podStartSLOduration=2.283322223 podStartE2EDuration="2.505633391s" podCreationTimestamp="2026-02-17 17:12:19 +0000 UTC" firstStartedPulling="2026-02-17 17:12:20.412906643 +0000 UTC m=+1808.169981967" lastFinishedPulling="2026-02-17 17:12:20.635217811 +0000 
UTC m=+1808.392293135" observedRunningTime="2026-02-17 17:12:21.501409526 +0000 UTC m=+1809.258484850" watchObservedRunningTime="2026-02-17 17:12:21.505633391 +0000 UTC m=+1809.262708715" Feb 17 17:12:24 crc kubenswrapper[4694]: I0217 17:12:24.838277 4694 scope.go:117] "RemoveContainer" containerID="98162195a19395ea4a0720118004d57cbaa07b39b01083f919adf7834060a6f8" Feb 17 17:12:24 crc kubenswrapper[4694]: I0217 17:12:24.867033 4694 scope.go:117] "RemoveContainer" containerID="ceb990be6949e681bb1fea1b62505171ada5b24ced52ef07719bd8ff02456cbc" Feb 17 17:12:24 crc kubenswrapper[4694]: I0217 17:12:24.910386 4694 scope.go:117] "RemoveContainer" containerID="750d66418942f40339f05c0fa3b216d6baaec3352af5b871e96d1a2458362721" Feb 17 17:12:24 crc kubenswrapper[4694]: I0217 17:12:24.958514 4694 scope.go:117] "RemoveContainer" containerID="e3742b0fad3479cfb4be13b484d19a01a24540b2f5f3c93e8b6eb8e6433fec80" Feb 17 17:12:25 crc kubenswrapper[4694]: I0217 17:12:25.003127 4694 scope.go:117] "RemoveContainer" containerID="3423ecb763c09df03daad0a422cd33e28707ae6d198eba5a1c2b046e8df06eeb" Feb 17 17:12:25 crc kubenswrapper[4694]: I0217 17:12:25.052939 4694 scope.go:117] "RemoveContainer" containerID="3ab4c2cee84d3c79918f010d3921fed97bbece35e9d8f9503eac949ac85df9d9" Feb 17 17:12:25 crc kubenswrapper[4694]: I0217 17:12:25.896467 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:12:25 crc kubenswrapper[4694]: E0217 17:12:25.896873 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:12:27 crc kubenswrapper[4694]: I0217 17:12:27.040299 
4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mmbl9"] Feb 17 17:12:27 crc kubenswrapper[4694]: I0217 17:12:27.051884 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mmbl9"] Feb 17 17:12:28 crc kubenswrapper[4694]: I0217 17:12:28.907858 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdf9bc5-0b05-4382-a139-24f51593f749" path="/var/lib/kubelet/pods/3bdf9bc5-0b05-4382-a139-24f51593f749/volumes" Feb 17 17:12:32 crc kubenswrapper[4694]: I0217 17:12:32.034823 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qkvkc"] Feb 17 17:12:32 crc kubenswrapper[4694]: I0217 17:12:32.046481 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qkvkc"] Feb 17 17:12:32 crc kubenswrapper[4694]: I0217 17:12:32.904522 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3b5789-af72-49a1-8003-151eb88df06e" path="/var/lib/kubelet/pods/5b3b5789-af72-49a1-8003-151eb88df06e/volumes" Feb 17 17:12:36 crc kubenswrapper[4694]: I0217 17:12:36.896000 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:12:36 crc kubenswrapper[4694]: E0217 17:12:36.896773 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:12:41 crc kubenswrapper[4694]: I0217 17:12:41.049194 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bxnn6"] Feb 17 17:12:41 crc kubenswrapper[4694]: I0217 17:12:41.057267 4694 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-l2cxb"] Feb 17 17:12:41 crc kubenswrapper[4694]: I0217 17:12:41.065472 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bxnn6"] Feb 17 17:12:41 crc kubenswrapper[4694]: I0217 17:12:41.073554 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-l2cxb"] Feb 17 17:12:41 crc kubenswrapper[4694]: I0217 17:12:41.081433 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2194-account-create-update-52gm2"] Feb 17 17:12:41 crc kubenswrapper[4694]: I0217 17:12:41.090463 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2194-account-create-update-52gm2"] Feb 17 17:12:42 crc kubenswrapper[4694]: I0217 17:12:42.915906 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2483097f-291b-41d8-9428-6bca4956ae91" path="/var/lib/kubelet/pods/2483097f-291b-41d8-9428-6bca4956ae91/volumes" Feb 17 17:12:42 crc kubenswrapper[4694]: I0217 17:12:42.918067 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7502f1-dba9-43c8-81b3-8516714bca75" path="/var/lib/kubelet/pods/7e7502f1-dba9-43c8-81b3-8516714bca75/volumes" Feb 17 17:12:42 crc kubenswrapper[4694]: I0217 17:12:42.919532 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42d3f8d-87a8-4499-a06e-4c0bd452ba66" path="/var/lib/kubelet/pods/a42d3f8d-87a8-4499-a06e-4c0bd452ba66/volumes" Feb 17 17:12:44 crc kubenswrapper[4694]: I0217 17:12:44.027830 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hgsbq"] Feb 17 17:12:44 crc kubenswrapper[4694]: I0217 17:12:44.037949 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hgsbq"] Feb 17 17:12:44 crc kubenswrapper[4694]: I0217 17:12:44.905686 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01db982-56b8-4f3c-98f8-c21954640fce" 
path="/var/lib/kubelet/pods/b01db982-56b8-4f3c-98f8-c21954640fce/volumes" Feb 17 17:12:45 crc kubenswrapper[4694]: I0217 17:12:45.034539 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7ac7-account-create-update-nd26q"] Feb 17 17:12:45 crc kubenswrapper[4694]: I0217 17:12:45.045603 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7ac7-account-create-update-nd26q"] Feb 17 17:12:45 crc kubenswrapper[4694]: I0217 17:12:45.055311 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0b44-account-create-update-bhwxj"] Feb 17 17:12:45 crc kubenswrapper[4694]: I0217 17:12:45.064220 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0b44-account-create-update-bhwxj"] Feb 17 17:12:46 crc kubenswrapper[4694]: I0217 17:12:46.905535 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04936e25-daf1-4d3a-8256-8e1c127688cb" path="/var/lib/kubelet/pods/04936e25-daf1-4d3a-8256-8e1c127688cb/volumes" Feb 17 17:12:46 crc kubenswrapper[4694]: I0217 17:12:46.906507 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ebd2d40-ad72-4e4d-9264-890f92641e9d" path="/var/lib/kubelet/pods/3ebd2d40-ad72-4e4d-9264-890f92641e9d/volumes" Feb 17 17:12:49 crc kubenswrapper[4694]: I0217 17:12:49.039984 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-w7fks"] Feb 17 17:12:49 crc kubenswrapper[4694]: I0217 17:12:49.061974 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-w7fks"] Feb 17 17:12:50 crc kubenswrapper[4694]: I0217 17:12:50.903849 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4782973d-718d-4b2a-9b1e-84dfcfbafced" path="/var/lib/kubelet/pods/4782973d-718d-4b2a-9b1e-84dfcfbafced/volumes" Feb 17 17:12:51 crc kubenswrapper[4694]: I0217 17:12:51.896023 4694 scope.go:117] "RemoveContainer" 
containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:12:51 crc kubenswrapper[4694]: E0217 17:12:51.896430 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:13:05 crc kubenswrapper[4694]: I0217 17:13:05.895223 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:13:05 crc kubenswrapper[4694]: E0217 17:13:05.897063 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:13:16 crc kubenswrapper[4694]: I0217 17:13:16.895988 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:13:17 crc kubenswrapper[4694]: I0217 17:13:17.995404 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"ec123f0ed5af2dd76240e50f17f9912ad66145b52bcd42fc0facf04b5463eb02"} Feb 17 17:13:23 crc kubenswrapper[4694]: I0217 17:13:23.051932 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dgmfl"] Feb 17 17:13:23 crc kubenswrapper[4694]: I0217 17:13:23.060466 4694 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dgmfl"] Feb 17 17:13:24 crc kubenswrapper[4694]: I0217 17:13:24.906209 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ec1579-807c-4af0-8332-9a52733beed0" path="/var/lib/kubelet/pods/15ec1579-807c-4af0-8332-9a52733beed0/volumes" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.193329 4694 scope.go:117] "RemoveContainer" containerID="39724c483ff231898372944df1f10912b1ad27b5bcae744ea3bdc174bada0df2" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.248041 4694 scope.go:117] "RemoveContainer" containerID="af629601be94a0118df68580f2a3373bbec2327d97ae533f2f2b542bd9b3c151" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.270710 4694 scope.go:117] "RemoveContainer" containerID="bae03796d1c540d22f8d7c5755ca9d5eea84ef6055d2beadf7b6512fe0b3fbb6" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.316661 4694 scope.go:117] "RemoveContainer" containerID="f1753018eaf4672a884f59e1b88afbc63d9da89e9860846e99a377180d8f762d" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.392570 4694 scope.go:117] "RemoveContainer" containerID="c5fcd645551e1c3cdb5d945297b25cba32fce983fcf514f2bd71f5eaff0f4d21" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.442329 4694 scope.go:117] "RemoveContainer" containerID="7a1057d50737cbb1ff2f673544f86aedd1f0d217a56937acde54e48d67956437" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.464103 4694 scope.go:117] "RemoveContainer" containerID="61643b7fad49604a99a262c89b7cc43143f1d16a9f3497c3153e275b875a7902" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.481262 4694 scope.go:117] "RemoveContainer" containerID="bb6bcb4b5ef7af6e3a457dfc6b56d879f42e873b5acf3a9dced8cf7bad135dc8" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 17:13:25.498649 4694 scope.go:117] "RemoveContainer" containerID="8f91331c2c2a4ce787273b6090b2a2ad65306d509a6a3576b9799dff4ecab789" Feb 17 17:13:25 crc kubenswrapper[4694]: I0217 
17:13:25.518474 4694 scope.go:117] "RemoveContainer" containerID="27a0c7908fab8f0da115dfbefa8e382675b0af1d4083cc91b4ac077713e22a89" Feb 17 17:13:29 crc kubenswrapper[4694]: I0217 17:13:29.036121 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-swggc"] Feb 17 17:13:29 crc kubenswrapper[4694]: I0217 17:13:29.047859 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-swggc"] Feb 17 17:13:30 crc kubenswrapper[4694]: I0217 17:13:30.907300 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8994bf6c-4617-4837-a8a2-4d399f187abb" path="/var/lib/kubelet/pods/8994bf6c-4617-4837-a8a2-4d399f187abb/volumes" Feb 17 17:13:32 crc kubenswrapper[4694]: I0217 17:13:32.128098 4694 generic.go:334] "Generic (PLEG): container finished" podID="af1e17e7-cd69-4f0f-8e3f-e36399e001a8" containerID="493a6d05a0126899c2eb5d376471f5707ed59c156ba2d077c86be07082964787" exitCode=0 Feb 17 17:13:32 crc kubenswrapper[4694]: I0217 17:13:32.128238 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" event={"ID":"af1e17e7-cd69-4f0f-8e3f-e36399e001a8","Type":"ContainerDied","Data":"493a6d05a0126899c2eb5d376471f5707ed59c156ba2d077c86be07082964787"} Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.520794 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.620205 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-inventory\") pod \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.620358 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5xnm\" (UniqueName: \"kubernetes.io/projected/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-kube-api-access-j5xnm\") pod \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.621023 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-ssh-key-openstack-edpm-ipam\") pod \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\" (UID: \"af1e17e7-cd69-4f0f-8e3f-e36399e001a8\") " Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.625720 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-kube-api-access-j5xnm" (OuterVolumeSpecName: "kube-api-access-j5xnm") pod "af1e17e7-cd69-4f0f-8e3f-e36399e001a8" (UID: "af1e17e7-cd69-4f0f-8e3f-e36399e001a8"). InnerVolumeSpecName "kube-api-access-j5xnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.646312 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af1e17e7-cd69-4f0f-8e3f-e36399e001a8" (UID: "af1e17e7-cd69-4f0f-8e3f-e36399e001a8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.648637 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-inventory" (OuterVolumeSpecName: "inventory") pod "af1e17e7-cd69-4f0f-8e3f-e36399e001a8" (UID: "af1e17e7-cd69-4f0f-8e3f-e36399e001a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.722589 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.722639 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5xnm\" (UniqueName: \"kubernetes.io/projected/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-kube-api-access-j5xnm\") on node \"crc\" DevicePath \"\"" Feb 17 17:13:33 crc kubenswrapper[4694]: I0217 17:13:33.722652 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af1e17e7-cd69-4f0f-8e3f-e36399e001a8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.035190 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hhvxg"] Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 
17:13:34.043270 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hhvxg"] Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.144709 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" event={"ID":"af1e17e7-cd69-4f0f-8e3f-e36399e001a8","Type":"ContainerDied","Data":"9434b7bfc45f093429e4e53d0f0df784469a0b75d57ca0e03f1f754d05de48d9"} Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.144752 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9434b7bfc45f093429e4e53d0f0df784469a0b75d57ca0e03f1f754d05de48d9" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.144773 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.228712 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv"] Feb 17 17:13:34 crc kubenswrapper[4694]: E0217 17:13:34.229128 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1e17e7-cd69-4f0f-8e3f-e36399e001a8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.229150 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1e17e7-cd69-4f0f-8e3f-e36399e001a8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.229366 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1e17e7-cd69-4f0f-8e3f-e36399e001a8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.230036 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.232395 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.232595 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.232660 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.235769 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.239517 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv"] Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.333408 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrh4\" (UniqueName: \"kubernetes.io/projected/bfbc588a-92ae-49ee-bcad-433dc28ecad7-kube-api-access-5xrh4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.333506 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 
17:13:34.334033 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.439366 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.439861 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrh4\" (UniqueName: \"kubernetes.io/projected/bfbc588a-92ae-49ee-bcad-433dc28ecad7-kube-api-access-5xrh4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.439931 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.445245 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.446726 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.464559 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrh4\" (UniqueName: \"kubernetes.io/projected/bfbc588a-92ae-49ee-bcad-433dc28ecad7-kube-api-access-5xrh4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.555376 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:34 crc kubenswrapper[4694]: I0217 17:13:34.905491 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56edac5-8790-4475-ac42-c958ef4e523a" path="/var/lib/kubelet/pods/a56edac5-8790-4475-ac42-c958ef4e523a/volumes" Feb 17 17:13:35 crc kubenswrapper[4694]: I0217 17:13:35.058224 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv"] Feb 17 17:13:35 crc kubenswrapper[4694]: I0217 17:13:35.153349 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" event={"ID":"bfbc588a-92ae-49ee-bcad-433dc28ecad7","Type":"ContainerStarted","Data":"db0e3d7b9cbe2a474e0f877728048921fdda385a5318e06b07eb4a529231b414"} Feb 17 17:13:36 crc kubenswrapper[4694]: I0217 17:13:36.163391 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" event={"ID":"bfbc588a-92ae-49ee-bcad-433dc28ecad7","Type":"ContainerStarted","Data":"28bd1eb4eada2c15e15ced2050c75d32c580094a4ff6f88f5c25418530eb82f6"} Feb 17 17:13:36 crc kubenswrapper[4694]: I0217 17:13:36.185164 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" podStartSLOduration=1.966628167 podStartE2EDuration="2.18514077s" podCreationTimestamp="2026-02-17 17:13:34 +0000 UTC" firstStartedPulling="2026-02-17 17:13:35.06579606 +0000 UTC m=+1882.822871384" lastFinishedPulling="2026-02-17 17:13:35.284308673 +0000 UTC m=+1883.041383987" observedRunningTime="2026-02-17 17:13:36.176819903 +0000 UTC m=+1883.933895237" watchObservedRunningTime="2026-02-17 17:13:36.18514077 +0000 UTC m=+1883.942216104" Feb 17 17:13:41 crc kubenswrapper[4694]: I0217 17:13:41.032842 4694 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-db-sync-c88gm"] Feb 17 17:13:41 crc kubenswrapper[4694]: I0217 17:13:41.040525 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-c88gm"] Feb 17 17:13:41 crc kubenswrapper[4694]: I0217 17:13:41.213680 4694 generic.go:334] "Generic (PLEG): container finished" podID="bfbc588a-92ae-49ee-bcad-433dc28ecad7" containerID="28bd1eb4eada2c15e15ced2050c75d32c580094a4ff6f88f5c25418530eb82f6" exitCode=0 Feb 17 17:13:41 crc kubenswrapper[4694]: I0217 17:13:41.213882 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" event={"ID":"bfbc588a-92ae-49ee-bcad-433dc28ecad7","Type":"ContainerDied","Data":"28bd1eb4eada2c15e15ced2050c75d32c580094a4ff6f88f5c25418530eb82f6"} Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.608797 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.808353 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xrh4\" (UniqueName: \"kubernetes.io/projected/bfbc588a-92ae-49ee-bcad-433dc28ecad7-kube-api-access-5xrh4\") pod \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.808643 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-ssh-key-openstack-edpm-ipam\") pod \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.808792 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-inventory\") pod \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\" (UID: \"bfbc588a-92ae-49ee-bcad-433dc28ecad7\") " Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.818225 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbc588a-92ae-49ee-bcad-433dc28ecad7-kube-api-access-5xrh4" (OuterVolumeSpecName: "kube-api-access-5xrh4") pod "bfbc588a-92ae-49ee-bcad-433dc28ecad7" (UID: "bfbc588a-92ae-49ee-bcad-433dc28ecad7"). InnerVolumeSpecName "kube-api-access-5xrh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.836775 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bfbc588a-92ae-49ee-bcad-433dc28ecad7" (UID: "bfbc588a-92ae-49ee-bcad-433dc28ecad7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.845169 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-inventory" (OuterVolumeSpecName: "inventory") pod "bfbc588a-92ae-49ee-bcad-433dc28ecad7" (UID: "bfbc588a-92ae-49ee-bcad-433dc28ecad7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.909539 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaac0bee-f5f9-49c0-b880-6c57d412972e" path="/var/lib/kubelet/pods/aaac0bee-f5f9-49c0-b880-6c57d412972e/volumes" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.911463 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.911487 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xrh4\" (UniqueName: \"kubernetes.io/projected/bfbc588a-92ae-49ee-bcad-433dc28ecad7-kube-api-access-5xrh4\") on node \"crc\" DevicePath \"\"" Feb 17 17:13:42 crc kubenswrapper[4694]: I0217 17:13:42.911498 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfbc588a-92ae-49ee-bcad-433dc28ecad7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.231197 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" event={"ID":"bfbc588a-92ae-49ee-bcad-433dc28ecad7","Type":"ContainerDied","Data":"db0e3d7b9cbe2a474e0f877728048921fdda385a5318e06b07eb4a529231b414"} Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.231235 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db0e3d7b9cbe2a474e0f877728048921fdda385a5318e06b07eb4a529231b414" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.231298 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.380809 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8"] Feb 17 17:13:43 crc kubenswrapper[4694]: E0217 17:13:43.381186 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbc588a-92ae-49ee-bcad-433dc28ecad7" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.381203 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbc588a-92ae-49ee-bcad-433dc28ecad7" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.381398 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbc588a-92ae-49ee-bcad-433dc28ecad7" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.381983 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.384109 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.384424 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.384438 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.387428 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.398271 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8"] Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.521876 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m4lr\" (UniqueName: \"kubernetes.io/projected/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-kube-api-access-7m4lr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.522017 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.522042 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.624074 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m4lr\" (UniqueName: \"kubernetes.io/projected/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-kube-api-access-7m4lr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.624178 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.624200 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.627908 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.628166 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.644570 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m4lr\" (UniqueName: \"kubernetes.io/projected/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-kube-api-access-7m4lr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwrb8\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:43 crc kubenswrapper[4694]: I0217 17:13:43.703750 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" Feb 17 17:13:44 crc kubenswrapper[4694]: I0217 17:13:44.186749 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8"] Feb 17 17:13:44 crc kubenswrapper[4694]: I0217 17:13:44.239257 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" event={"ID":"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35","Type":"ContainerStarted","Data":"cf27cabe05494d468423dc4c8502542558b9215f97b95ef03f2cb55cf8d69ea6"} Feb 17 17:13:45 crc kubenswrapper[4694]: I0217 17:13:45.249973 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" event={"ID":"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35","Type":"ContainerStarted","Data":"c90bc0ed7952c4e4fcb054df10205bc8dada9eb09a7a72e4dd2588a57a083f15"} Feb 17 17:13:45 crc kubenswrapper[4694]: I0217 17:13:45.276349 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" podStartSLOduration=2.106320221 podStartE2EDuration="2.276329818s" podCreationTimestamp="2026-02-17 17:13:43 +0000 UTC" firstStartedPulling="2026-02-17 17:13:44.193522766 +0000 UTC m=+1891.950598100" lastFinishedPulling="2026-02-17 17:13:44.363532373 +0000 UTC m=+1892.120607697" observedRunningTime="2026-02-17 17:13:45.265583511 +0000 UTC m=+1893.022658835" watchObservedRunningTime="2026-02-17 17:13:45.276329818 +0000 UTC m=+1893.033405142" Feb 17 17:13:50 crc kubenswrapper[4694]: I0217 17:13:50.028767 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-x8w9s"] Feb 17 17:13:50 crc kubenswrapper[4694]: I0217 17:13:50.037814 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-x8w9s"] Feb 17 17:13:50 crc kubenswrapper[4694]: I0217 17:13:50.907492 4694 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ff074b-45bc-4b82-89cf-b42f4b5991e1" path="/var/lib/kubelet/pods/a3ff074b-45bc-4b82-89cf-b42f4b5991e1/volumes"
Feb 17 17:14:19 crc kubenswrapper[4694]: I0217 17:14:19.726981 4694 generic.go:334] "Generic (PLEG): container finished" podID="a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" containerID="c90bc0ed7952c4e4fcb054df10205bc8dada9eb09a7a72e4dd2588a57a083f15" exitCode=0
Feb 17 17:14:19 crc kubenswrapper[4694]: I0217 17:14:19.727033 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" event={"ID":"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35","Type":"ContainerDied","Data":"c90bc0ed7952c4e4fcb054df10205bc8dada9eb09a7a72e4dd2588a57a083f15"}
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.097880 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.231934 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-ssh-key-openstack-edpm-ipam\") pod \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") "
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.231987 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m4lr\" (UniqueName: \"kubernetes.io/projected/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-kube-api-access-7m4lr\") pod \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") "
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.232135 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-inventory\") pod \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\" (UID: \"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35\") "
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.238440 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-kube-api-access-7m4lr" (OuterVolumeSpecName: "kube-api-access-7m4lr") pod "a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" (UID: "a74ac9aa-ac88-4b13-b10b-9fe0f9195f35"). InnerVolumeSpecName "kube-api-access-7m4lr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.261983 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-inventory" (OuterVolumeSpecName: "inventory") pod "a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" (UID: "a74ac9aa-ac88-4b13-b10b-9fe0f9195f35"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.263786 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" (UID: "a74ac9aa-ac88-4b13-b10b-9fe0f9195f35"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.334838 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-inventory\") on node \"crc\" DevicePath \"\""
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.334903 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.334923 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m4lr\" (UniqueName: \"kubernetes.io/projected/a74ac9aa-ac88-4b13-b10b-9fe0f9195f35-kube-api-access-7m4lr\") on node \"crc\" DevicePath \"\""
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.747769 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8" event={"ID":"a74ac9aa-ac88-4b13-b10b-9fe0f9195f35","Type":"ContainerDied","Data":"cf27cabe05494d468423dc4c8502542558b9215f97b95ef03f2cb55cf8d69ea6"}
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.747849 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf27cabe05494d468423dc4c8502542558b9215f97b95ef03f2cb55cf8d69ea6"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.747852 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwrb8"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.829955 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"]
Feb 17 17:14:21 crc kubenswrapper[4694]: E0217 17:14:21.830361 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.830384 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.830580 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74ac9aa-ac88-4b13-b10b-9fe0f9195f35" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.831365 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.833342 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.833534 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.833737 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.833907 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.839738 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"]
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.946116 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.946202 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzpqv\" (UniqueName: \"kubernetes.io/projected/3b2a9feb-de71-42ff-b0ae-f4697f525469-kube-api-access-qzpqv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:21 crc kubenswrapper[4694]: I0217 17:14:21.946386 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.048492 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.048589 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzpqv\" (UniqueName: \"kubernetes.io/projected/3b2a9feb-de71-42ff-b0ae-f4697f525469-kube-api-access-qzpqv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.048745 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.053845 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.053882 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.067879 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzpqv\" (UniqueName: \"kubernetes.io/projected/3b2a9feb-de71-42ff-b0ae-f4697f525469-kube-api-access-qzpqv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.146518 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.681907 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"]
Feb 17 17:14:22 crc kubenswrapper[4694]: W0217 17:14:22.684385 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b2a9feb_de71_42ff_b0ae_f4697f525469.slice/crio-003478fcb3a72ca2c35fe8b811d828b37bde746814c9854693221aba9fb3d367 WatchSource:0}: Error finding container 003478fcb3a72ca2c35fe8b811d828b37bde746814c9854693221aba9fb3d367: Status 404 returned error can't find the container with id 003478fcb3a72ca2c35fe8b811d828b37bde746814c9854693221aba9fb3d367
Feb 17 17:14:22 crc kubenswrapper[4694]: I0217 17:14:22.757270 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c" event={"ID":"3b2a9feb-de71-42ff-b0ae-f4697f525469","Type":"ContainerStarted","Data":"003478fcb3a72ca2c35fe8b811d828b37bde746814c9854693221aba9fb3d367"}
Feb 17 17:14:23 crc kubenswrapper[4694]: I0217 17:14:23.766595 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c" event={"ID":"3b2a9feb-de71-42ff-b0ae-f4697f525469","Type":"ContainerStarted","Data":"8a476396024846d8239bd19883b59d07c0c050b9cb78af5241dbf12d1ef987d9"}
Feb 17 17:14:23 crc kubenswrapper[4694]: I0217 17:14:23.783447 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c" podStartSLOduration=2.638933117 podStartE2EDuration="2.78342697s" podCreationTimestamp="2026-02-17 17:14:21 +0000 UTC" firstStartedPulling="2026-02-17 17:14:22.686471606 +0000 UTC m=+1930.443546930" lastFinishedPulling="2026-02-17 17:14:22.830965459 +0000 UTC m=+1930.588040783" observedRunningTime="2026-02-17 17:14:23.78019779 +0000 UTC m=+1931.537273134" watchObservedRunningTime="2026-02-17 17:14:23.78342697 +0000 UTC m=+1931.540502304"
Feb 17 17:14:25 crc kubenswrapper[4694]: I0217 17:14:25.702255 4694 scope.go:117] "RemoveContainer" containerID="40ee26cc8680cea303b2f2855f47dfe1afcf69bbc87b19279d146731379268e3"
Feb 17 17:14:25 crc kubenswrapper[4694]: I0217 17:14:25.736590 4694 scope.go:117] "RemoveContainer" containerID="c90a18508957bd152efd082f8496d131de7ab9c842d5dc91741236c1efc161a4"
Feb 17 17:14:25 crc kubenswrapper[4694]: I0217 17:14:25.768839 4694 scope.go:117] "RemoveContainer" containerID="efb4b8ceef872668e6b83e8412c1d3fd7f9b6b53acf6ff5521691ea2549711cf"
Feb 17 17:14:25 crc kubenswrapper[4694]: I0217 17:14:25.829711 4694 scope.go:117] "RemoveContainer" containerID="17852fe2d3511ee9cf884df541308d754b56783c29e45e3395cd35609c84b6b3"
Feb 17 17:14:32 crc kubenswrapper[4694]: I0217 17:14:32.049223 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3572-account-create-update-zzbtt"]
Feb 17 17:14:32 crc kubenswrapper[4694]: I0217 17:14:32.058939 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3572-account-create-update-zzbtt"]
Feb 17 17:14:32 crc kubenswrapper[4694]: I0217 17:14:32.905515 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d812e0-ed6a-4704-a149-35d388dc6d9b" path="/var/lib/kubelet/pods/99d812e0-ed6a-4704-a149-35d388dc6d9b/volumes"
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.039731 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a9d3-account-create-update-qvwcg"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.054079 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fq6xd"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.065596 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vcrmb"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.075758 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-327d-account-create-update-5mzdp"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.083468 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zgbvz"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.092458 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fq6xd"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.099744 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a9d3-account-create-update-qvwcg"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.128453 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-327d-account-create-update-5mzdp"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.131936 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zgbvz"]
Feb 17 17:14:33 crc kubenswrapper[4694]: I0217 17:14:33.148377 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vcrmb"]
Feb 17 17:14:34 crc kubenswrapper[4694]: I0217 17:14:34.905424 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c514806-c7d8-4b8f-ba4f-866d382f6d82" path="/var/lib/kubelet/pods/1c514806-c7d8-4b8f-ba4f-866d382f6d82/volumes"
Feb 17 17:14:34 crc kubenswrapper[4694]: I0217 17:14:34.906131 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af0afc1-594e-4c0d-a74b-6fb9266a7f57" path="/var/lib/kubelet/pods/3af0afc1-594e-4c0d-a74b-6fb9266a7f57/volumes"
Feb 17 17:14:34 crc kubenswrapper[4694]: I0217 17:14:34.906733 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c35aab0-3f76-440f-b2b6-eecbe7ddbff0" path="/var/lib/kubelet/pods/9c35aab0-3f76-440f-b2b6-eecbe7ddbff0/volumes"
Feb 17 17:14:34 crc kubenswrapper[4694]: I0217 17:14:34.907233 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9bd5acb-d4c7-474d-b8cb-3e39de8265b9" path="/var/lib/kubelet/pods/a9bd5acb-d4c7-474d-b8cb-3e39de8265b9/volumes"
Feb 17 17:14:34 crc kubenswrapper[4694]: I0217 17:14:34.908203 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2af48f7-ad7a-4692-a40c-c6d30bbe2402" path="/var/lib/kubelet/pods/f2af48f7-ad7a-4692-a40c-c6d30bbe2402/volumes"
Feb 17 17:14:55 crc kubenswrapper[4694]: I0217 17:14:55.074843 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fpg7"]
Feb 17 17:14:55 crc kubenswrapper[4694]: I0217 17:14:55.082158 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8fpg7"]
Feb 17 17:14:56 crc kubenswrapper[4694]: I0217 17:14:56.906655 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146d2c58-f359-4372-810a-7ab64e022ad1" path="/var/lib/kubelet/pods/146d2c58-f359-4372-810a-7ab64e022ad1/volumes"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.141322 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"]
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.143191 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.145482 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.147314 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.148836 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"]
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.267976 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6afa0b10-5605-4f37-a69c-d2a973dace47-config-volume\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.268059 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6afa0b10-5605-4f37-a69c-d2a973dace47-secret-volume\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.268145 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb599\" (UniqueName: \"kubernetes.io/projected/6afa0b10-5605-4f37-a69c-d2a973dace47-kube-api-access-sb599\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.369801 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6afa0b10-5605-4f37-a69c-d2a973dace47-config-volume\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.370091 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6afa0b10-5605-4f37-a69c-d2a973dace47-secret-volume\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.370288 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb599\" (UniqueName: \"kubernetes.io/projected/6afa0b10-5605-4f37-a69c-d2a973dace47-kube-api-access-sb599\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.371038 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6afa0b10-5605-4f37-a69c-d2a973dace47-config-volume\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.388989 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6afa0b10-5605-4f37-a69c-d2a973dace47-secret-volume\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.393787 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb599\" (UniqueName: \"kubernetes.io/projected/6afa0b10-5605-4f37-a69c-d2a973dace47-kube-api-access-sb599\") pod \"collect-profiles-29522475-cmwp6\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.468715 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:00 crc kubenswrapper[4694]: I0217 17:15:00.933070 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"]
Feb 17 17:15:01 crc kubenswrapper[4694]: I0217 17:15:01.076076 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6" event={"ID":"6afa0b10-5605-4f37-a69c-d2a973dace47","Type":"ContainerStarted","Data":"a833b12841bdb378e30a3c372dafcaf2f6e372659f2c1f6d99335687cd1d9c83"}
Feb 17 17:15:02 crc kubenswrapper[4694]: I0217 17:15:02.091006 4694 generic.go:334] "Generic (PLEG): container finished" podID="6afa0b10-5605-4f37-a69c-d2a973dace47" containerID="f37c7b9b349917b530f8278def5b4fe284ce002f14e2c3a912e8bd72bb099db4" exitCode=0
Feb 17 17:15:02 crc kubenswrapper[4694]: I0217 17:15:02.091107 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6" event={"ID":"6afa0b10-5605-4f37-a69c-d2a973dace47","Type":"ContainerDied","Data":"f37c7b9b349917b530f8278def5b4fe284ce002f14e2c3a912e8bd72bb099db4"}
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.464590 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.527973 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6afa0b10-5605-4f37-a69c-d2a973dace47-secret-volume\") pod \"6afa0b10-5605-4f37-a69c-d2a973dace47\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") "
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.528142 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb599\" (UniqueName: \"kubernetes.io/projected/6afa0b10-5605-4f37-a69c-d2a973dace47-kube-api-access-sb599\") pod \"6afa0b10-5605-4f37-a69c-d2a973dace47\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") "
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.529590 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6afa0b10-5605-4f37-a69c-d2a973dace47-config-volume\") pod \"6afa0b10-5605-4f37-a69c-d2a973dace47\" (UID: \"6afa0b10-5605-4f37-a69c-d2a973dace47\") "
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.530248 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6afa0b10-5605-4f37-a69c-d2a973dace47-config-volume" (OuterVolumeSpecName: "config-volume") pod "6afa0b10-5605-4f37-a69c-d2a973dace47" (UID: "6afa0b10-5605-4f37-a69c-d2a973dace47"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.530683 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6afa0b10-5605-4f37-a69c-d2a973dace47-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.539425 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afa0b10-5605-4f37-a69c-d2a973dace47-kube-api-access-sb599" (OuterVolumeSpecName: "kube-api-access-sb599") pod "6afa0b10-5605-4f37-a69c-d2a973dace47" (UID: "6afa0b10-5605-4f37-a69c-d2a973dace47"). InnerVolumeSpecName "kube-api-access-sb599". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.539816 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afa0b10-5605-4f37-a69c-d2a973dace47-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6afa0b10-5605-4f37-a69c-d2a973dace47" (UID: "6afa0b10-5605-4f37-a69c-d2a973dace47"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.633995 4694 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6afa0b10-5605-4f37-a69c-d2a973dace47-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 17:15:03 crc kubenswrapper[4694]: I0217 17:15:03.634042 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb599\" (UniqueName: \"kubernetes.io/projected/6afa0b10-5605-4f37-a69c-d2a973dace47-kube-api-access-sb599\") on node \"crc\" DevicePath \"\""
Feb 17 17:15:04 crc kubenswrapper[4694]: I0217 17:15:04.136280 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6" event={"ID":"6afa0b10-5605-4f37-a69c-d2a973dace47","Type":"ContainerDied","Data":"a833b12841bdb378e30a3c372dafcaf2f6e372659f2c1f6d99335687cd1d9c83"}
Feb 17 17:15:04 crc kubenswrapper[4694]: I0217 17:15:04.136354 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a833b12841bdb378e30a3c372dafcaf2f6e372659f2c1f6d99335687cd1d9c83"
Feb 17 17:15:04 crc kubenswrapper[4694]: I0217 17:15:04.136307 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522475-cmwp6"
Feb 17 17:15:04 crc kubenswrapper[4694]: I0217 17:15:04.537721 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"]
Feb 17 17:15:04 crc kubenswrapper[4694]: I0217 17:15:04.549027 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522430-mx6nw"]
Feb 17 17:15:04 crc kubenswrapper[4694]: I0217 17:15:04.908414 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ac4a19-2aa4-44da-ac5d-4df6622094b2" path="/var/lib/kubelet/pods/04ac4a19-2aa4-44da-ac5d-4df6622094b2/volumes"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.696290 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4dw6w"]
Feb 17 17:15:06 crc kubenswrapper[4694]: E0217 17:15:06.697114 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afa0b10-5605-4f37-a69c-d2a973dace47" containerName="collect-profiles"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.697131 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afa0b10-5605-4f37-a69c-d2a973dace47" containerName="collect-profiles"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.697347 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afa0b10-5605-4f37-a69c-d2a973dace47" containerName="collect-profiles"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.699046 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.711992 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dw6w"]
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.804226 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-catalog-content\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.804302 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snnfj\" (UniqueName: \"kubernetes.io/projected/94a00290-b190-4619-a733-74a2f1db86d3-kube-api-access-snnfj\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.804383 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-utilities\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.905477 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-catalog-content\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.905526 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snnfj\" (UniqueName: \"kubernetes.io/projected/94a00290-b190-4619-a733-74a2f1db86d3-kube-api-access-snnfj\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.905597 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-utilities\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.905998 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-catalog-content\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.906031 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-utilities\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:06 crc kubenswrapper[4694]: I0217 17:15:06.926304 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snnfj\" (UniqueName: \"kubernetes.io/projected/94a00290-b190-4619-a733-74a2f1db86d3-kube-api-access-snnfj\") pod \"redhat-marketplace-4dw6w\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:07 crc kubenswrapper[4694]: I0217 17:15:07.031782 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dw6w"
Feb 17 17:15:07 crc kubenswrapper[4694]: I0217 17:15:07.520347 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dw6w"]
Feb 17 17:15:07 crc kubenswrapper[4694]: E0217 17:15:07.941004 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a00290_b190_4619_a733_74a2f1db86d3.slice/crio-conmon-a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a00290_b190_4619_a733_74a2f1db86d3.slice/crio-a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24.scope\": RecentStats: unable to find data in memory cache]"
Feb 17 17:15:08 crc kubenswrapper[4694]: I0217 17:15:08.176583 4694 generic.go:334] "Generic (PLEG): container finished" podID="94a00290-b190-4619-a733-74a2f1db86d3" containerID="a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24" exitCode=0
Feb 17 17:15:08 crc kubenswrapper[4694]: I0217 17:15:08.176753 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dw6w" event={"ID":"94a00290-b190-4619-a733-74a2f1db86d3","Type":"ContainerDied","Data":"a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24"}
Feb 17 17:15:08 crc kubenswrapper[4694]: I0217 17:15:08.176789 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dw6w" event={"ID":"94a00290-b190-4619-a733-74a2f1db86d3","Type":"ContainerStarted","Data":"6b415dad9360164ce651fa50f73664a9036773196b96761a3ef5ab4824d3a539"}
Feb 17 17:15:08 crc kubenswrapper[4694]: I0217 17:15:08.182711 4694 generic.go:334] "Generic (PLEG): container finished" podID="3b2a9feb-de71-42ff-b0ae-f4697f525469" containerID="8a476396024846d8239bd19883b59d07c0c050b9cb78af5241dbf12d1ef987d9" exitCode=0
Feb 17 17:15:08 crc kubenswrapper[4694]: I0217 17:15:08.182763 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c" event={"ID":"3b2a9feb-de71-42ff-b0ae-f4697f525469","Type":"ContainerDied","Data":"8a476396024846d8239bd19883b59d07c0c050b9cb78af5241dbf12d1ef987d9"}
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.612978 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c"
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.774424 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzpqv\" (UniqueName: \"kubernetes.io/projected/3b2a9feb-de71-42ff-b0ae-f4697f525469-kube-api-access-qzpqv\") pod \"3b2a9feb-de71-42ff-b0ae-f4697f525469\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") "
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.774712 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-inventory\") pod \"3b2a9feb-de71-42ff-b0ae-f4697f525469\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") "
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.774822 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-ssh-key-openstack-edpm-ipam\") pod \"3b2a9feb-de71-42ff-b0ae-f4697f525469\" (UID: \"3b2a9feb-de71-42ff-b0ae-f4697f525469\") "
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.787856 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2a9feb-de71-42ff-b0ae-f4697f525469-kube-api-access-qzpqv" (OuterVolumeSpecName: "kube-api-access-qzpqv") pod "3b2a9feb-de71-42ff-b0ae-f4697f525469" (UID: "3b2a9feb-de71-42ff-b0ae-f4697f525469"). InnerVolumeSpecName "kube-api-access-qzpqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.801805 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3b2a9feb-de71-42ff-b0ae-f4697f525469" (UID: "3b2a9feb-de71-42ff-b0ae-f4697f525469"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.807752 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-inventory" (OuterVolumeSpecName: "inventory") pod "3b2a9feb-de71-42ff-b0ae-f4697f525469" (UID: "3b2a9feb-de71-42ff-b0ae-f4697f525469"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.876983 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzpqv\" (UniqueName: \"kubernetes.io/projected/3b2a9feb-de71-42ff-b0ae-f4697f525469-kube-api-access-qzpqv\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.877039 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:09 crc kubenswrapper[4694]: I0217 17:15:09.877051 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b2a9feb-de71-42ff-b0ae-f4697f525469-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.202030 4694 generic.go:334] "Generic (PLEG): container finished" podID="94a00290-b190-4619-a733-74a2f1db86d3" containerID="be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07" exitCode=0 Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.202121 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dw6w" event={"ID":"94a00290-b190-4619-a733-74a2f1db86d3","Type":"ContainerDied","Data":"be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07"} Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.204277 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c" event={"ID":"3b2a9feb-de71-42ff-b0ae-f4697f525469","Type":"ContainerDied","Data":"003478fcb3a72ca2c35fe8b811d828b37bde746814c9854693221aba9fb3d367"} Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.204309 4694 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="003478fcb3a72ca2c35fe8b811d828b37bde746814c9854693221aba9fb3d367" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.204357 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.292532 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vsfcq"] Feb 17 17:15:10 crc kubenswrapper[4694]: E0217 17:15:10.292982 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2a9feb-de71-42ff-b0ae-f4697f525469" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.293005 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2a9feb-de71-42ff-b0ae-f4697f525469" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.293200 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2a9feb-de71-42ff-b0ae-f4697f525469" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.293823 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.296356 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.296678 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.300151 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.300284 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.303421 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vsfcq"] Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.385778 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.385880 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.385922 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t6gjw\" (UniqueName: \"kubernetes.io/projected/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-kube-api-access-t6gjw\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.487401 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.487487 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.487522 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6gjw\" (UniqueName: \"kubernetes.io/projected/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-kube-api-access-t6gjw\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.492840 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.492840 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.506194 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6gjw\" (UniqueName: \"kubernetes.io/projected/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-kube-api-access-t6gjw\") pod \"ssh-known-hosts-edpm-deployment-vsfcq\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:10 crc kubenswrapper[4694]: I0217 17:15:10.626076 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:11 crc kubenswrapper[4694]: I0217 17:15:11.135208 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-vsfcq"] Feb 17 17:15:11 crc kubenswrapper[4694]: W0217 17:15:11.138997 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f46d450_c4d9_4b9e_bcb3_5e3ea915f59a.slice/crio-6deb500cee4ad6710f61e9564d1670db54c8530c2d30b1e09ffc8a2f67f41d2a WatchSource:0}: Error finding container 6deb500cee4ad6710f61e9564d1670db54c8530c2d30b1e09ffc8a2f67f41d2a: Status 404 returned error can't find the container with id 6deb500cee4ad6710f61e9564d1670db54c8530c2d30b1e09ffc8a2f67f41d2a Feb 17 17:15:11 crc kubenswrapper[4694]: I0217 17:15:11.214100 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dw6w" event={"ID":"94a00290-b190-4619-a733-74a2f1db86d3","Type":"ContainerStarted","Data":"b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185"} 
Feb 17 17:15:11 crc kubenswrapper[4694]: I0217 17:15:11.215407 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" event={"ID":"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a","Type":"ContainerStarted","Data":"6deb500cee4ad6710f61e9564d1670db54c8530c2d30b1e09ffc8a2f67f41d2a"} Feb 17 17:15:11 crc kubenswrapper[4694]: I0217 17:15:11.234565 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4dw6w" podStartSLOduration=2.832838913 podStartE2EDuration="5.234544809s" podCreationTimestamp="2026-02-17 17:15:06 +0000 UTC" firstStartedPulling="2026-02-17 17:15:08.179684269 +0000 UTC m=+1975.936759593" lastFinishedPulling="2026-02-17 17:15:10.581390165 +0000 UTC m=+1978.338465489" observedRunningTime="2026-02-17 17:15:11.229556196 +0000 UTC m=+1978.986631530" watchObservedRunningTime="2026-02-17 17:15:11.234544809 +0000 UTC m=+1978.991620143" Feb 17 17:15:12 crc kubenswrapper[4694]: I0217 17:15:12.227094 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" event={"ID":"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a","Type":"ContainerStarted","Data":"2f7fb2e7c0846f1a0778b5600201e482a28a0c36fe2c5b7b4ed4ec7d23524f06"} Feb 17 17:15:12 crc kubenswrapper[4694]: I0217 17:15:12.247354 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" podStartSLOduration=2.002022922 podStartE2EDuration="2.247333993s" podCreationTimestamp="2026-02-17 17:15:10 +0000 UTC" firstStartedPulling="2026-02-17 17:15:11.141339952 +0000 UTC m=+1978.898415276" lastFinishedPulling="2026-02-17 17:15:11.386651023 +0000 UTC m=+1979.143726347" observedRunningTime="2026-02-17 17:15:12.245349663 +0000 UTC m=+1980.002424997" watchObservedRunningTime="2026-02-17 17:15:12.247333993 +0000 UTC m=+1980.004409317" Feb 17 17:15:13 crc kubenswrapper[4694]: I0217 17:15:13.044554 4694 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4tzgw"] Feb 17 17:15:13 crc kubenswrapper[4694]: I0217 17:15:13.055696 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4tzgw"] Feb 17 17:15:14 crc kubenswrapper[4694]: I0217 17:15:14.905028 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a2de30-5a77-4179-b166-fcc003c41c17" path="/var/lib/kubelet/pods/16a2de30-5a77-4179-b166-fcc003c41c17/volumes" Feb 17 17:15:15 crc kubenswrapper[4694]: I0217 17:15:15.032888 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kpsxt"] Feb 17 17:15:15 crc kubenswrapper[4694]: I0217 17:15:15.043701 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kpsxt"] Feb 17 17:15:16 crc kubenswrapper[4694]: I0217 17:15:16.905768 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003c20cf-819e-4d24-ba0b-a66652b8d5a3" path="/var/lib/kubelet/pods/003c20cf-819e-4d24-ba0b-a66652b8d5a3/volumes" Feb 17 17:15:17 crc kubenswrapper[4694]: I0217 17:15:17.032170 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4dw6w" Feb 17 17:15:17 crc kubenswrapper[4694]: I0217 17:15:17.032534 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4dw6w" Feb 17 17:15:17 crc kubenswrapper[4694]: I0217 17:15:17.103081 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4dw6w" Feb 17 17:15:17 crc kubenswrapper[4694]: I0217 17:15:17.308174 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4dw6w" Feb 17 17:15:17 crc kubenswrapper[4694]: I0217 17:15:17.352266 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4dw6w"] Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.285554 4694 generic.go:334] "Generic (PLEG): container finished" podID="4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" containerID="2f7fb2e7c0846f1a0778b5600201e482a28a0c36fe2c5b7b4ed4ec7d23524f06" exitCode=0 Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.285649 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" event={"ID":"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a","Type":"ContainerDied","Data":"2f7fb2e7c0846f1a0778b5600201e482a28a0c36fe2c5b7b4ed4ec7d23524f06"} Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.286187 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4dw6w" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="registry-server" containerID="cri-o://b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185" gracePeriod=2 Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.724091 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dw6w" Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.873061 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snnfj\" (UniqueName: \"kubernetes.io/projected/94a00290-b190-4619-a733-74a2f1db86d3-kube-api-access-snnfj\") pod \"94a00290-b190-4619-a733-74a2f1db86d3\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.873156 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-utilities\") pod \"94a00290-b190-4619-a733-74a2f1db86d3\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.873252 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-catalog-content\") pod \"94a00290-b190-4619-a733-74a2f1db86d3\" (UID: \"94a00290-b190-4619-a733-74a2f1db86d3\") " Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.873965 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-utilities" (OuterVolumeSpecName: "utilities") pod "94a00290-b190-4619-a733-74a2f1db86d3" (UID: "94a00290-b190-4619-a733-74a2f1db86d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.881233 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a00290-b190-4619-a733-74a2f1db86d3-kube-api-access-snnfj" (OuterVolumeSpecName: "kube-api-access-snnfj") pod "94a00290-b190-4619-a733-74a2f1db86d3" (UID: "94a00290-b190-4619-a733-74a2f1db86d3"). InnerVolumeSpecName "kube-api-access-snnfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.899140 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a00290-b190-4619-a733-74a2f1db86d3" (UID: "94a00290-b190-4619-a733-74a2f1db86d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.975666 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snnfj\" (UniqueName: \"kubernetes.io/projected/94a00290-b190-4619-a733-74a2f1db86d3-kube-api-access-snnfj\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.975710 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:19 crc kubenswrapper[4694]: I0217 17:15:19.975720 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a00290-b190-4619-a733-74a2f1db86d3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.298934 4694 generic.go:334] "Generic (PLEG): container finished" podID="94a00290-b190-4619-a733-74a2f1db86d3" containerID="b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185" exitCode=0 Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.299024 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4dw6w" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.299036 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dw6w" event={"ID":"94a00290-b190-4619-a733-74a2f1db86d3","Type":"ContainerDied","Data":"b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185"} Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.299795 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4dw6w" event={"ID":"94a00290-b190-4619-a733-74a2f1db86d3","Type":"ContainerDied","Data":"6b415dad9360164ce651fa50f73664a9036773196b96761a3ef5ab4824d3a539"} Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.299827 4694 scope.go:117] "RemoveContainer" containerID="b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.347571 4694 scope.go:117] "RemoveContainer" containerID="be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.353945 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dw6w"] Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.362286 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4dw6w"] Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.378809 4694 scope.go:117] "RemoveContainer" containerID="a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.423911 4694 scope.go:117] "RemoveContainer" containerID="b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185" Feb 17 17:15:20 crc kubenswrapper[4694]: E0217 17:15:20.424430 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185\": container with ID starting with b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185 not found: ID does not exist" containerID="b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.424461 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185"} err="failed to get container status \"b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185\": rpc error: code = NotFound desc = could not find container \"b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185\": container with ID starting with b1796716823f5973a5b4b182e617780103723b2ff65ed386ae6fa8ebf6880185 not found: ID does not exist" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.424481 4694 scope.go:117] "RemoveContainer" containerID="be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07" Feb 17 17:15:20 crc kubenswrapper[4694]: E0217 17:15:20.425389 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07\": container with ID starting with be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07 not found: ID does not exist" containerID="be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.425423 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07"} err="failed to get container status \"be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07\": rpc error: code = NotFound desc = could not find container \"be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07\": container with ID 
starting with be1fee4966caaf308cc35fc0902843a7689b8593125df1386718161bd3111f07 not found: ID does not exist" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.425445 4694 scope.go:117] "RemoveContainer" containerID="a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24" Feb 17 17:15:20 crc kubenswrapper[4694]: E0217 17:15:20.425750 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24\": container with ID starting with a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24 not found: ID does not exist" containerID="a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.425777 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24"} err="failed to get container status \"a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24\": rpc error: code = NotFound desc = could not find container \"a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24\": container with ID starting with a72fe8167a0905a749c0bec4fdef7c52e3632e47950f40a5dfc90d62d8a49f24 not found: ID does not exist" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.692824 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.786531 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-inventory-0\") pod \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.786647 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6gjw\" (UniqueName: \"kubernetes.io/projected/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-kube-api-access-t6gjw\") pod \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.786845 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-ssh-key-openstack-edpm-ipam\") pod \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\" (UID: \"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a\") " Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.791071 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-kube-api-access-t6gjw" (OuterVolumeSpecName: "kube-api-access-t6gjw") pod "4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" (UID: "4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a"). InnerVolumeSpecName "kube-api-access-t6gjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.812241 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" (UID: "4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.813498 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" (UID: "4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.888720 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.888768 4694 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.888787 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6gjw\" (UniqueName: \"kubernetes.io/projected/4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a-kube-api-access-t6gjw\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:20 crc kubenswrapper[4694]: I0217 17:15:20.905135 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a00290-b190-4619-a733-74a2f1db86d3" path="/var/lib/kubelet/pods/94a00290-b190-4619-a733-74a2f1db86d3/volumes" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.309913 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" event={"ID":"4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a","Type":"ContainerDied","Data":"6deb500cee4ad6710f61e9564d1670db54c8530c2d30b1e09ffc8a2f67f41d2a"} Feb 17 17:15:21 crc 
kubenswrapper[4694]: I0217 17:15:21.309955 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-vsfcq" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.309965 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6deb500cee4ad6710f61e9564d1670db54c8530c2d30b1e09ffc8a2f67f41d2a" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.368297 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m"] Feb 17 17:15:21 crc kubenswrapper[4694]: E0217 17:15:21.368762 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="extract-content" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.368780 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="extract-content" Feb 17 17:15:21 crc kubenswrapper[4694]: E0217 17:15:21.368794 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" containerName="ssh-known-hosts-edpm-deployment" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.368800 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" containerName="ssh-known-hosts-edpm-deployment" Feb 17 17:15:21 crc kubenswrapper[4694]: E0217 17:15:21.368826 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="registry-server" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.368833 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="registry-server" Feb 17 17:15:21 crc kubenswrapper[4694]: E0217 17:15:21.368842 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a00290-b190-4619-a733-74a2f1db86d3" 
containerName="extract-utilities" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.368848 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="extract-utilities" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.369000 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a00290-b190-4619-a733-74a2f1db86d3" containerName="registry-server" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.369023 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a" containerName="ssh-known-hosts-edpm-deployment" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.369619 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.372040 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.381955 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m"] Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.382096 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.382334 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.382696 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.514367 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.514602 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjvbr\" (UniqueName: \"kubernetes.io/projected/f7ab42be-d837-4b1d-8d80-164f92fc205a-kube-api-access-fjvbr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.514908 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.616218 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.616287 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.616357 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjvbr\" (UniqueName: \"kubernetes.io/projected/f7ab42be-d837-4b1d-8d80-164f92fc205a-kube-api-access-fjvbr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.621927 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.622711 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.632471 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjvbr\" (UniqueName: \"kubernetes.io/projected/f7ab42be-d837-4b1d-8d80-164f92fc205a-kube-api-access-fjvbr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wfs8m\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:21 crc kubenswrapper[4694]: I0217 17:15:21.724333 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:22 crc kubenswrapper[4694]: I0217 17:15:22.256838 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m"] Feb 17 17:15:22 crc kubenswrapper[4694]: W0217 17:15:22.262128 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ab42be_d837_4b1d_8d80_164f92fc205a.slice/crio-077d33e35ff02548bf9fe79ed0be2737d3109f6f2a8bb7b3307a7403064d4166 WatchSource:0}: Error finding container 077d33e35ff02548bf9fe79ed0be2737d3109f6f2a8bb7b3307a7403064d4166: Status 404 returned error can't find the container with id 077d33e35ff02548bf9fe79ed0be2737d3109f6f2a8bb7b3307a7403064d4166 Feb 17 17:15:22 crc kubenswrapper[4694]: I0217 17:15:22.319562 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" event={"ID":"f7ab42be-d837-4b1d-8d80-164f92fc205a","Type":"ContainerStarted","Data":"077d33e35ff02548bf9fe79ed0be2737d3109f6f2a8bb7b3307a7403064d4166"} Feb 17 17:15:23 crc kubenswrapper[4694]: I0217 17:15:23.331641 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" event={"ID":"f7ab42be-d837-4b1d-8d80-164f92fc205a","Type":"ContainerStarted","Data":"0bdb1bf67b1d3c6900f8730e294ffd179c5813d7d711a7b7932d154a5a847384"} Feb 17 17:15:23 crc kubenswrapper[4694]: I0217 17:15:23.356899 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" podStartSLOduration=2.173454627 podStartE2EDuration="2.356878746s" podCreationTimestamp="2026-02-17 17:15:21 +0000 UTC" firstStartedPulling="2026-02-17 17:15:22.265282532 +0000 UTC m=+1990.022357866" lastFinishedPulling="2026-02-17 17:15:22.448706661 +0000 UTC m=+1990.205781985" observedRunningTime="2026-02-17 
17:15:23.353549764 +0000 UTC m=+1991.110625108" watchObservedRunningTime="2026-02-17 17:15:23.356878746 +0000 UTC m=+1991.113954080" Feb 17 17:15:25 crc kubenswrapper[4694]: I0217 17:15:25.957533 4694 scope.go:117] "RemoveContainer" containerID="88b29609efa228f8a413bf695867660c805fb47539ef92289ab3c9f246872be9" Feb 17 17:15:25 crc kubenswrapper[4694]: I0217 17:15:25.985034 4694 scope.go:117] "RemoveContainer" containerID="f948a22f8d37eb66368806270f57d55d41a8d2a68c66b08027eff86ec3789c89" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.049448 4694 scope.go:117] "RemoveContainer" containerID="8bed84bb390689d0473185d854ba1a7944b04c6ef7c1dd3da724bcceeba88939" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.093796 4694 scope.go:117] "RemoveContainer" containerID="e426f8ac564c7fb8d937d55a954486801a1888cee574223beb1e18d07bac18d5" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.129813 4694 scope.go:117] "RemoveContainer" containerID="de510ba0bf86b421cacbe4e0c732310ef7108bd7e3abfe754f209c8de1f23fef" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.171562 4694 scope.go:117] "RemoveContainer" containerID="1277097a7fada0aeb58b66553e375879a146eaa803facec0851bdf6bbecd8211" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.207405 4694 scope.go:117] "RemoveContainer" containerID="cfceb706fc3f55ab542e92a971751838abefc980efe7a7cd5d23cd248e20f6a2" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.228977 4694 scope.go:117] "RemoveContainer" containerID="a1e011f48b4ba8031e1e705766f1981fcec519350256b5cc05faac3f3ce1cad9" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.248717 4694 scope.go:117] "RemoveContainer" containerID="10cc4e59299d63d7e4085c9de930c7d20d7b910cde4706468651a75a3bfe68e8" Feb 17 17:15:26 crc kubenswrapper[4694]: I0217 17:15:26.270941 4694 scope.go:117] "RemoveContainer" containerID="641e40d4546df29de4a08a0e11e9bc84004564bc7210af31fd35b53da7878083" Feb 17 17:15:30 crc kubenswrapper[4694]: I0217 17:15:30.416202 4694 
generic.go:334] "Generic (PLEG): container finished" podID="f7ab42be-d837-4b1d-8d80-164f92fc205a" containerID="0bdb1bf67b1d3c6900f8730e294ffd179c5813d7d711a7b7932d154a5a847384" exitCode=0 Feb 17 17:15:30 crc kubenswrapper[4694]: I0217 17:15:30.416333 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" event={"ID":"f7ab42be-d837-4b1d-8d80-164f92fc205a","Type":"ContainerDied","Data":"0bdb1bf67b1d3c6900f8730e294ffd179c5813d7d711a7b7932d154a5a847384"} Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.834022 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.911184 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-ssh-key-openstack-edpm-ipam\") pod \"f7ab42be-d837-4b1d-8d80-164f92fc205a\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.911686 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-inventory\") pod \"f7ab42be-d837-4b1d-8d80-164f92fc205a\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.911758 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjvbr\" (UniqueName: \"kubernetes.io/projected/f7ab42be-d837-4b1d-8d80-164f92fc205a-kube-api-access-fjvbr\") pod \"f7ab42be-d837-4b1d-8d80-164f92fc205a\" (UID: \"f7ab42be-d837-4b1d-8d80-164f92fc205a\") " Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.917070 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f7ab42be-d837-4b1d-8d80-164f92fc205a-kube-api-access-fjvbr" (OuterVolumeSpecName: "kube-api-access-fjvbr") pod "f7ab42be-d837-4b1d-8d80-164f92fc205a" (UID: "f7ab42be-d837-4b1d-8d80-164f92fc205a"). InnerVolumeSpecName "kube-api-access-fjvbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.939337 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7ab42be-d837-4b1d-8d80-164f92fc205a" (UID: "f7ab42be-d837-4b1d-8d80-164f92fc205a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:31 crc kubenswrapper[4694]: I0217 17:15:31.941418 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-inventory" (OuterVolumeSpecName: "inventory") pod "f7ab42be-d837-4b1d-8d80-164f92fc205a" (UID: "f7ab42be-d837-4b1d-8d80-164f92fc205a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.016493 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.016526 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjvbr\" (UniqueName: \"kubernetes.io/projected/f7ab42be-d837-4b1d-8d80-164f92fc205a-kube-api-access-fjvbr\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.016536 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab42be-d837-4b1d-8d80-164f92fc205a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.433079 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" event={"ID":"f7ab42be-d837-4b1d-8d80-164f92fc205a","Type":"ContainerDied","Data":"077d33e35ff02548bf9fe79ed0be2737d3109f6f2a8bb7b3307a7403064d4166"} Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.433116 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="077d33e35ff02548bf9fe79ed0be2737d3109f6f2a8bb7b3307a7403064d4166" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.433164 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wfs8m" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.518517 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s"] Feb 17 17:15:32 crc kubenswrapper[4694]: E0217 17:15:32.518925 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ab42be-d837-4b1d-8d80-164f92fc205a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.518940 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ab42be-d837-4b1d-8d80-164f92fc205a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.519174 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ab42be-d837-4b1d-8d80-164f92fc205a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.519808 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.524257 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.524710 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.525106 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.525432 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.535006 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s"] Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.625002 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2x2c\" (UniqueName: \"kubernetes.io/projected/fe2671a8-04cd-4b09-ba6a-e6250762985e-kube-api-access-t2x2c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.625066 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.625177 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.726976 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.727080 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.727179 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2x2c\" (UniqueName: \"kubernetes.io/projected/fe2671a8-04cd-4b09-ba6a-e6250762985e-kube-api-access-t2x2c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.733247 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.738036 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.746313 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2x2c\" (UniqueName: \"kubernetes.io/projected/fe2671a8-04cd-4b09-ba6a-e6250762985e-kube-api-access-t2x2c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:32 crc kubenswrapper[4694]: I0217 17:15:32.849360 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:33 crc kubenswrapper[4694]: I0217 17:15:33.336600 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s"] Feb 17 17:15:33 crc kubenswrapper[4694]: I0217 17:15:33.443881 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" event={"ID":"fe2671a8-04cd-4b09-ba6a-e6250762985e","Type":"ContainerStarted","Data":"f264ed5fa782396a03350937c061ed8169460c9db3df965eb15665c82e1cb102"} Feb 17 17:15:34 crc kubenswrapper[4694]: I0217 17:15:34.452595 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" event={"ID":"fe2671a8-04cd-4b09-ba6a-e6250762985e","Type":"ContainerStarted","Data":"29142a911cd78717baa9e263d0b5b1f4b4fe664218ab3c12e0d70e44b4ef74ba"} Feb 17 17:15:34 crc kubenswrapper[4694]: I0217 17:15:34.473025 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" podStartSLOduration=2.275784744 podStartE2EDuration="2.473004114s" podCreationTimestamp="2026-02-17 17:15:32 +0000 UTC" firstStartedPulling="2026-02-17 17:15:33.33695063 +0000 UTC m=+2001.094025954" lastFinishedPulling="2026-02-17 17:15:33.53417 +0000 UTC m=+2001.291245324" observedRunningTime="2026-02-17 17:15:34.465455827 +0000 UTC m=+2002.222531151" watchObservedRunningTime="2026-02-17 17:15:34.473004114 +0000 UTC m=+2002.230079438" Feb 17 17:15:42 crc kubenswrapper[4694]: I0217 17:15:42.526677 4694 generic.go:334] "Generic (PLEG): container finished" podID="fe2671a8-04cd-4b09-ba6a-e6250762985e" containerID="29142a911cd78717baa9e263d0b5b1f4b4fe664218ab3c12e0d70e44b4ef74ba" exitCode=0 Feb 17 17:15:42 crc kubenswrapper[4694]: I0217 17:15:42.526723 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" event={"ID":"fe2671a8-04cd-4b09-ba6a-e6250762985e","Type":"ContainerDied","Data":"29142a911cd78717baa9e263d0b5b1f4b4fe664218ab3c12e0d70e44b4ef74ba"} Feb 17 17:15:43 crc kubenswrapper[4694]: I0217 17:15:43.934963 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.124230 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-inventory\") pod \"fe2671a8-04cd-4b09-ba6a-e6250762985e\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.126070 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-ssh-key-openstack-edpm-ipam\") pod \"fe2671a8-04cd-4b09-ba6a-e6250762985e\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.126138 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2x2c\" (UniqueName: \"kubernetes.io/projected/fe2671a8-04cd-4b09-ba6a-e6250762985e-kube-api-access-t2x2c\") pod \"fe2671a8-04cd-4b09-ba6a-e6250762985e\" (UID: \"fe2671a8-04cd-4b09-ba6a-e6250762985e\") " Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.129800 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2671a8-04cd-4b09-ba6a-e6250762985e-kube-api-access-t2x2c" (OuterVolumeSpecName: "kube-api-access-t2x2c") pod "fe2671a8-04cd-4b09-ba6a-e6250762985e" (UID: "fe2671a8-04cd-4b09-ba6a-e6250762985e"). InnerVolumeSpecName "kube-api-access-t2x2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.150328 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe2671a8-04cd-4b09-ba6a-e6250762985e" (UID: "fe2671a8-04cd-4b09-ba6a-e6250762985e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.150820 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-inventory" (OuterVolumeSpecName: "inventory") pod "fe2671a8-04cd-4b09-ba6a-e6250762985e" (UID: "fe2671a8-04cd-4b09-ba6a-e6250762985e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.232213 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.232500 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2x2c\" (UniqueName: \"kubernetes.io/projected/fe2671a8-04cd-4b09-ba6a-e6250762985e-kube-api-access-t2x2c\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.232514 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe2671a8-04cd-4b09-ba6a-e6250762985e-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.545227 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" 
event={"ID":"fe2671a8-04cd-4b09-ba6a-e6250762985e","Type":"ContainerDied","Data":"f264ed5fa782396a03350937c061ed8169460c9db3df965eb15665c82e1cb102"} Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.545270 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f264ed5fa782396a03350937c061ed8169460c9db3df965eb15665c82e1cb102" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.545330 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.617872 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.617927 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.620777 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5"] Feb 17 17:15:44 crc kubenswrapper[4694]: E0217 17:15:44.621135 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2671a8-04cd-4b09-ba6a-e6250762985e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.621152 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2671a8-04cd-4b09-ba6a-e6250762985e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:44 crc 
kubenswrapper[4694]: I0217 17:15:44.621360 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2671a8-04cd-4b09-ba6a-e6250762985e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.621984 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.624790 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.625293 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.625744 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.625899 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.628094 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.628497 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.630812 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.636823 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.638833 4694 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5"] Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740747 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740842 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brln8\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-kube-api-access-brln8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740878 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740894 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740920 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740957 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.740997 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741023 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741066 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741093 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741111 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741176 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: 
\"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741194 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.741252 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843158 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843209 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: 
\"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843262 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843299 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843343 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843374 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843424 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brln8\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-kube-api-access-brln8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843448 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843464 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843488 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843510 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843530 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843551 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.843573 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.848298 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.848347 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.848973 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.848984 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.849210 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.849269 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.849964 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.850110 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.850572 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" 
(UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.850649 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.851189 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.851629 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.852953 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 
17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.864274 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brln8\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-kube-api-access-brln8\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:44 crc kubenswrapper[4694]: I0217 17:15:44.939921 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:15:45 crc kubenswrapper[4694]: I0217 17:15:45.455363 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5"] Feb 17 17:15:45 crc kubenswrapper[4694]: I0217 17:15:45.560222 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" event={"ID":"fc2a86d9-61f8-4af3-9835-2aeea9736b84","Type":"ContainerStarted","Data":"8147be982f2f2393bf62dc472d91f326f9601244ba6b0571e4d5725224f85696"} Feb 17 17:15:46 crc kubenswrapper[4694]: I0217 17:15:46.569076 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" event={"ID":"fc2a86d9-61f8-4af3-9835-2aeea9736b84","Type":"ContainerStarted","Data":"e1d87f003f34d04c9d67dd7bafac45e93ca044bb45b142d2c6fdbea92214d0e6"} Feb 17 17:15:46 crc kubenswrapper[4694]: I0217 17:15:46.598179 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" podStartSLOduration=2.424573587 podStartE2EDuration="2.598163453s" podCreationTimestamp="2026-02-17 17:15:44 +0000 UTC" firstStartedPulling="2026-02-17 17:15:45.46380828 +0000 UTC m=+2013.220883604" lastFinishedPulling="2026-02-17 17:15:45.637398146 +0000 UTC 
m=+2013.394473470" observedRunningTime="2026-02-17 17:15:46.584506285 +0000 UTC m=+2014.341581609" watchObservedRunningTime="2026-02-17 17:15:46.598163453 +0000 UTC m=+2014.355238777" Feb 17 17:15:58 crc kubenswrapper[4694]: I0217 17:15:58.053794 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-n6wlz"] Feb 17 17:15:58 crc kubenswrapper[4694]: I0217 17:15:58.063624 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-n6wlz"] Feb 17 17:15:58 crc kubenswrapper[4694]: I0217 17:15:58.905677 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0984ad1-baf2-4fc2-890e-1bd93b726913" path="/var/lib/kubelet/pods/e0984ad1-baf2-4fc2-890e-1bd93b726913/volumes" Feb 17 17:16:14 crc kubenswrapper[4694]: I0217 17:16:14.617868 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:16:14 crc kubenswrapper[4694]: I0217 17:16:14.618525 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:16:21 crc kubenswrapper[4694]: I0217 17:16:21.881218 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jfwh8"] Feb 17 17:16:21 crc kubenswrapper[4694]: I0217 17:16:21.885473 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:21 crc kubenswrapper[4694]: I0217 17:16:21.893308 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfwh8"] Feb 17 17:16:21 crc kubenswrapper[4694]: I0217 17:16:21.908236 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntwj\" (UniqueName: \"kubernetes.io/projected/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-kube-api-access-xntwj\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:21 crc kubenswrapper[4694]: I0217 17:16:21.908333 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-utilities\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:21 crc kubenswrapper[4694]: I0217 17:16:21.908419 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-catalog-content\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.010060 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-utilities\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.010175 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-catalog-content\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.010298 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntwj\" (UniqueName: \"kubernetes.io/projected/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-kube-api-access-xntwj\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.010823 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-utilities\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.011101 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-catalog-content\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.032942 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntwj\" (UniqueName: \"kubernetes.io/projected/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-kube-api-access-xntwj\") pod \"redhat-operators-jfwh8\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.129601 4694 generic.go:334] "Generic (PLEG): container finished" podID="fc2a86d9-61f8-4af3-9835-2aeea9736b84" 
containerID="e1d87f003f34d04c9d67dd7bafac45e93ca044bb45b142d2c6fdbea92214d0e6" exitCode=0 Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.129695 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" event={"ID":"fc2a86d9-61f8-4af3-9835-2aeea9736b84","Type":"ContainerDied","Data":"e1d87f003f34d04c9d67dd7bafac45e93ca044bb45b142d2c6fdbea92214d0e6"} Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.213283 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:22 crc kubenswrapper[4694]: I0217 17:16:22.672506 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfwh8"] Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.139715 4694 generic.go:334] "Generic (PLEG): container finished" podID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerID="42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090" exitCode=0 Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.139779 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerDied","Data":"42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090"} Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.139837 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerStarted","Data":"e736461b0b405561be9ca5960afe74affc4296e7092e01095e5723440652ebd9"} Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.141787 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.731846 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841343 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841390 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-libvirt-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841420 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-nova-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841447 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-neutron-metadata-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841484 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" 
(UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841510 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-telemetry-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841587 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ovn-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841627 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-repo-setup-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841668 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-bootstrap-combined-ca-bundle\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841735 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brln8\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-kube-api-access-brln8\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841788 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-inventory\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841827 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841872 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.841910 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ssh-key-openstack-edpm-ipam\") pod \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\" (UID: \"fc2a86d9-61f8-4af3-9835-2aeea9736b84\") " Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.849987 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.850993 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.851061 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.851851 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.851920 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-kube-api-access-brln8" (OuterVolumeSpecName: "kube-api-access-brln8") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "kube-api-access-brln8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.852300 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.852808 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.853374 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.853823 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.855459 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.855560 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.856877 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.883754 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.885786 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-inventory" (OuterVolumeSpecName: "inventory") pod "fc2a86d9-61f8-4af3-9835-2aeea9736b84" (UID: "fc2a86d9-61f8-4af3-9835-2aeea9736b84"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944263 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944400 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944714 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944757 4694 reconciler_common.go:293] "Volume detached for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944774 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944790 4694 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944804 4694 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944817 4694 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944831 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944846 4694 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 
17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944859 4694 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944872 4694 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944888 4694 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc2a86d9-61f8-4af3-9835-2aeea9736b84-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:23 crc kubenswrapper[4694]: I0217 17:16:23.944902 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brln8\" (UniqueName: \"kubernetes.io/projected/fc2a86d9-61f8-4af3-9835-2aeea9736b84-kube-api-access-brln8\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.148205 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" event={"ID":"fc2a86d9-61f8-4af3-9835-2aeea9736b84","Type":"ContainerDied","Data":"8147be982f2f2393bf62dc472d91f326f9601244ba6b0571e4d5725224f85696"} Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.148247 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8147be982f2f2393bf62dc472d91f326f9601244ba6b0571e4d5725224f85696" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.149381 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.243094 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25"] Feb 17 17:16:24 crc kubenswrapper[4694]: E0217 17:16:24.243587 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2a86d9-61f8-4af3-9835-2aeea9736b84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.243623 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2a86d9-61f8-4af3-9835-2aeea9736b84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.243861 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2a86d9-61f8-4af3-9835-2aeea9736b84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.244641 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.246789 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.247021 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.247150 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.249798 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.249864 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jmj\" (UniqueName: \"kubernetes.io/projected/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-kube-api-access-t6jmj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.250000 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" 
Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.250162 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.250217 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.251083 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.253112 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25"] Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.258176 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.351627 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.351721 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-t6jmj\" (UniqueName: \"kubernetes.io/projected/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-kube-api-access-t6jmj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.351828 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.351972 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.352033 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.352541 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.356436 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.357196 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.357547 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.370016 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jmj\" (UniqueName: \"kubernetes.io/projected/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-kube-api-access-t6jmj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-rwf25\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:24 crc kubenswrapper[4694]: I0217 17:16:24.562949 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:16:25 crc kubenswrapper[4694]: I0217 17:16:25.138066 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25"] Feb 17 17:16:25 crc kubenswrapper[4694]: W0217 17:16:25.139528 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d24b5a_8b19_4532_a8ca_b34ebad591d1.slice/crio-74cc0eea99b95f2e38de43770a13450d8b2f6d6bbb2486a501f40e3daaa8b54c WatchSource:0}: Error finding container 74cc0eea99b95f2e38de43770a13450d8b2f6d6bbb2486a501f40e3daaa8b54c: Status 404 returned error can't find the container with id 74cc0eea99b95f2e38de43770a13450d8b2f6d6bbb2486a501f40e3daaa8b54c Feb 17 17:16:25 crc kubenswrapper[4694]: I0217 17:16:25.158584 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerStarted","Data":"3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212"} Feb 17 17:16:25 crc kubenswrapper[4694]: I0217 17:16:25.161892 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" event={"ID":"a7d24b5a-8b19-4532-a8ca-b34ebad591d1","Type":"ContainerStarted","Data":"74cc0eea99b95f2e38de43770a13450d8b2f6d6bbb2486a501f40e3daaa8b54c"} Feb 17 17:16:26 crc kubenswrapper[4694]: I0217 17:16:26.171821 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" event={"ID":"a7d24b5a-8b19-4532-a8ca-b34ebad591d1","Type":"ContainerStarted","Data":"23d70fd399a3233efa1331ff1f387808aaac39ffd557faa4109d500d44ed6ec1"} Feb 17 17:16:26 crc kubenswrapper[4694]: I0217 17:16:26.176975 4694 generic.go:334] "Generic (PLEG): container finished" podID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" 
containerID="3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212" exitCode=0 Feb 17 17:16:26 crc kubenswrapper[4694]: I0217 17:16:26.177033 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerDied","Data":"3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212"} Feb 17 17:16:26 crc kubenswrapper[4694]: I0217 17:16:26.193052 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" podStartSLOduration=1.925823498 podStartE2EDuration="2.193029091s" podCreationTimestamp="2026-02-17 17:16:24 +0000 UTC" firstStartedPulling="2026-02-17 17:16:25.141837286 +0000 UTC m=+2052.898912600" lastFinishedPulling="2026-02-17 17:16:25.409042859 +0000 UTC m=+2053.166118193" observedRunningTime="2026-02-17 17:16:26.188926589 +0000 UTC m=+2053.946001943" watchObservedRunningTime="2026-02-17 17:16:26.193029091 +0000 UTC m=+2053.950104415" Feb 17 17:16:26 crc kubenswrapper[4694]: I0217 17:16:26.462908 4694 scope.go:117] "RemoveContainer" containerID="52aba373abcffee22cf8260d1bd960ebba0cd09e3d539084d88680beccfb94d4" Feb 17 17:16:28 crc kubenswrapper[4694]: I0217 17:16:28.198444 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerStarted","Data":"c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5"} Feb 17 17:16:28 crc kubenswrapper[4694]: I0217 17:16:28.233848 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jfwh8" podStartSLOduration=3.103726175 podStartE2EDuration="7.233823955s" podCreationTimestamp="2026-02-17 17:16:21 +0000 UTC" firstStartedPulling="2026-02-17 17:16:23.141487803 +0000 UTC m=+2050.898563127" lastFinishedPulling="2026-02-17 17:16:27.271585573 +0000 UTC 
m=+2055.028660907" observedRunningTime="2026-02-17 17:16:28.227665663 +0000 UTC m=+2055.984741007" watchObservedRunningTime="2026-02-17 17:16:28.233823955 +0000 UTC m=+2055.990899279" Feb 17 17:16:32 crc kubenswrapper[4694]: I0217 17:16:32.214474 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:32 crc kubenswrapper[4694]: I0217 17:16:32.215039 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:33 crc kubenswrapper[4694]: I0217 17:16:33.260623 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jfwh8" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="registry-server" probeResult="failure" output=< Feb 17 17:16:33 crc kubenswrapper[4694]: timeout: failed to connect service ":50051" within 1s Feb 17 17:16:33 crc kubenswrapper[4694]: > Feb 17 17:16:42 crc kubenswrapper[4694]: I0217 17:16:42.269791 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:42 crc kubenswrapper[4694]: I0217 17:16:42.321383 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:42 crc kubenswrapper[4694]: I0217 17:16:42.509632 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfwh8"] Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.317814 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jfwh8" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="registry-server" containerID="cri-o://c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5" gracePeriod=2 Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.762579 4694 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.845430 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-utilities\") pod \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.845674 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntwj\" (UniqueName: \"kubernetes.io/projected/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-kube-api-access-xntwj\") pod \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.845785 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-catalog-content\") pod \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\" (UID: \"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14\") " Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.846414 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-utilities" (OuterVolumeSpecName: "utilities") pod "03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" (UID: "03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.851318 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-kube-api-access-xntwj" (OuterVolumeSpecName: "kube-api-access-xntwj") pod "03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" (UID: "03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14"). InnerVolumeSpecName "kube-api-access-xntwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.947470 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntwj\" (UniqueName: \"kubernetes.io/projected/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-kube-api-access-xntwj\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.947501 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:43 crc kubenswrapper[4694]: I0217 17:16:43.969338 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" (UID: "03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.048903 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.330302 4694 generic.go:334] "Generic (PLEG): container finished" podID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerID="c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5" exitCode=0 Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.330378 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfwh8" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.330395 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerDied","Data":"c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5"} Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.330774 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfwh8" event={"ID":"03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14","Type":"ContainerDied","Data":"e736461b0b405561be9ca5960afe74affc4296e7092e01095e5723440652ebd9"} Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.330796 4694 scope.go:117] "RemoveContainer" containerID="c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.365151 4694 scope.go:117] "RemoveContainer" containerID="3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.370362 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfwh8"] Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.380169 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jfwh8"] Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.386623 4694 scope.go:117] "RemoveContainer" containerID="42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.426476 4694 scope.go:117] "RemoveContainer" containerID="c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5" Feb 17 17:16:44 crc kubenswrapper[4694]: E0217 17:16:44.426885 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5\": container with ID starting with c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5 not found: ID does not exist" containerID="c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.426918 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5"} err="failed to get container status \"c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5\": rpc error: code = NotFound desc = could not find container \"c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5\": container with ID starting with c60b4eb9c7feba598fca27d6856cfdad37ad8d5470844907b105a68da8f47bf5 not found: ID does not exist" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.426937 4694 scope.go:117] "RemoveContainer" containerID="3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212" Feb 17 17:16:44 crc kubenswrapper[4694]: E0217 17:16:44.427144 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212\": container with ID starting with 3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212 not found: ID does not exist" containerID="3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.427167 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212"} err="failed to get container status \"3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212\": rpc error: code = NotFound desc = could not find container \"3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212\": container with ID 
starting with 3cdde09a4aeaa1eb27dfe8ee6a6235287901e4c7849ea23eb4a07ac47162c212 not found: ID does not exist" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.427182 4694 scope.go:117] "RemoveContainer" containerID="42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090" Feb 17 17:16:44 crc kubenswrapper[4694]: E0217 17:16:44.427357 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090\": container with ID starting with 42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090 not found: ID does not exist" containerID="42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.427376 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090"} err="failed to get container status \"42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090\": rpc error: code = NotFound desc = could not find container \"42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090\": container with ID starting with 42d13cc9497dc57eacf33f3792f53449dce6d4a5b590e2bcecd76307aa11f090 not found: ID does not exist" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.618083 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.618426 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.618542 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.619355 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec123f0ed5af2dd76240e50f17f9912ad66145b52bcd42fc0facf04b5463eb02"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.619510 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://ec123f0ed5af2dd76240e50f17f9912ad66145b52bcd42fc0facf04b5463eb02" gracePeriod=600 Feb 17 17:16:44 crc kubenswrapper[4694]: I0217 17:16:44.906799 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" path="/var/lib/kubelet/pods/03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14/volumes" Feb 17 17:16:45 crc kubenswrapper[4694]: I0217 17:16:45.343434 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="ec123f0ed5af2dd76240e50f17f9912ad66145b52bcd42fc0facf04b5463eb02" exitCode=0 Feb 17 17:16:45 crc kubenswrapper[4694]: I0217 17:16:45.343496 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"ec123f0ed5af2dd76240e50f17f9912ad66145b52bcd42fc0facf04b5463eb02"} Feb 17 17:16:45 crc 
kubenswrapper[4694]: I0217 17:16:45.343778 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f"} Feb 17 17:16:45 crc kubenswrapper[4694]: I0217 17:16:45.343801 4694 scope.go:117] "RemoveContainer" containerID="34c8b79aa42874b01981f4eb404c2dc58dbad18b4312be71a6b5124c95fb1a2f" Feb 17 17:17:24 crc kubenswrapper[4694]: I0217 17:17:24.676848 4694 generic.go:334] "Generic (PLEG): container finished" podID="a7d24b5a-8b19-4532-a8ca-b34ebad591d1" containerID="23d70fd399a3233efa1331ff1f387808aaac39ffd557faa4109d500d44ed6ec1" exitCode=0 Feb 17 17:17:24 crc kubenswrapper[4694]: I0217 17:17:24.676902 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" event={"ID":"a7d24b5a-8b19-4532-a8ca-b34ebad591d1","Type":"ContainerDied","Data":"23d70fd399a3233efa1331ff1f387808aaac39ffd557faa4109d500d44ed6ec1"} Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.070498 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.252996 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ssh-key-openstack-edpm-ipam\") pod \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.253362 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovn-combined-ca-bundle\") pod \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.253506 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovncontroller-config-0\") pod \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.253710 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-inventory\") pod \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.253816 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6jmj\" (UniqueName: \"kubernetes.io/projected/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-kube-api-access-t6jmj\") pod \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\" (UID: \"a7d24b5a-8b19-4532-a8ca-b34ebad591d1\") " Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.258541 4694 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-kube-api-access-t6jmj" (OuterVolumeSpecName: "kube-api-access-t6jmj") pod "a7d24b5a-8b19-4532-a8ca-b34ebad591d1" (UID: "a7d24b5a-8b19-4532-a8ca-b34ebad591d1"). InnerVolumeSpecName "kube-api-access-t6jmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.261513 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a7d24b5a-8b19-4532-a8ca-b34ebad591d1" (UID: "a7d24b5a-8b19-4532-a8ca-b34ebad591d1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.277027 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a7d24b5a-8b19-4532-a8ca-b34ebad591d1" (UID: "a7d24b5a-8b19-4532-a8ca-b34ebad591d1"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.280432 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-inventory" (OuterVolumeSpecName: "inventory") pod "a7d24b5a-8b19-4532-a8ca-b34ebad591d1" (UID: "a7d24b5a-8b19-4532-a8ca-b34ebad591d1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.283622 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a7d24b5a-8b19-4532-a8ca-b34ebad591d1" (UID: "a7d24b5a-8b19-4532-a8ca-b34ebad591d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.355977 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.356013 4694 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.356024 4694 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.356035 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.356044 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6jmj\" (UniqueName: \"kubernetes.io/projected/a7d24b5a-8b19-4532-a8ca-b34ebad591d1-kube-api-access-t6jmj\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.696876 4694 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" event={"ID":"a7d24b5a-8b19-4532-a8ca-b34ebad591d1","Type":"ContainerDied","Data":"74cc0eea99b95f2e38de43770a13450d8b2f6d6bbb2486a501f40e3daaa8b54c"} Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.696956 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74cc0eea99b95f2e38de43770a13450d8b2f6d6bbb2486a501f40e3daaa8b54c" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.697017 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-rwf25" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.796745 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4"] Feb 17 17:17:26 crc kubenswrapper[4694]: E0217 17:17:26.797344 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="registry-server" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.797416 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="registry-server" Feb 17 17:17:26 crc kubenswrapper[4694]: E0217 17:17:26.797476 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="extract-utilities" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.797551 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="extract-utilities" Feb 17 17:17:26 crc kubenswrapper[4694]: E0217 17:17:26.797657 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d24b5a-8b19-4532-a8ca-b34ebad591d1" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.797713 4694 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a7d24b5a-8b19-4532-a8ca-b34ebad591d1" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 17:17:26 crc kubenswrapper[4694]: E0217 17:17:26.797792 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="extract-content" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.797851 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="extract-content" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.798051 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d24b5a-8b19-4532-a8ca-b34ebad591d1" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.798116 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ba9c4e-5cf3-4bff-b808-ffbb7a82fb14" containerName="registry-server" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.798764 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.800488 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.801050 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.801261 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.801389 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.801515 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.802080 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.879326 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4"] Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.967178 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.967414 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.967524 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.967640 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrtqq\" (UniqueName: \"kubernetes.io/projected/55da2ff6-efab-4cee-acb9-b0a04edc8980-kube-api-access-xrtqq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.967756 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:26 crc kubenswrapper[4694]: I0217 17:17:26.967859 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.069945 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.070249 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.070363 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.070489 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrtqq\" (UniqueName: 
\"kubernetes.io/projected/55da2ff6-efab-4cee-acb9-b0a04edc8980-kube-api-access-xrtqq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.070672 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.070814 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.074518 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.074678 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.075329 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.075859 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.076015 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.090912 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrtqq\" (UniqueName: \"kubernetes.io/projected/55da2ff6-efab-4cee-acb9-b0a04edc8980-kube-api-access-xrtqq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4\" (UID: 
\"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.118771 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.654132 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4"] Feb 17 17:17:27 crc kubenswrapper[4694]: I0217 17:17:27.707176 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" event={"ID":"55da2ff6-efab-4cee-acb9-b0a04edc8980","Type":"ContainerStarted","Data":"c1939f741648549720b015dd01786761eca86288434966bddcc79ec5f9f0d50a"} Feb 17 17:17:28 crc kubenswrapper[4694]: I0217 17:17:28.721167 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" event={"ID":"55da2ff6-efab-4cee-acb9-b0a04edc8980","Type":"ContainerStarted","Data":"0aafcb4ca274d1429a1df9f9e002755a4fdd80cc823bb9079e0ae5ee4b3d056f"} Feb 17 17:17:28 crc kubenswrapper[4694]: I0217 17:17:28.750513 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" podStartSLOduration=2.5885037029999998 podStartE2EDuration="2.750490273s" podCreationTimestamp="2026-02-17 17:17:26 +0000 UTC" firstStartedPulling="2026-02-17 17:17:27.657844461 +0000 UTC m=+2115.414919795" lastFinishedPulling="2026-02-17 17:17:27.819831031 +0000 UTC m=+2115.576906365" observedRunningTime="2026-02-17 17:17:28.749672233 +0000 UTC m=+2116.506747597" watchObservedRunningTime="2026-02-17 17:17:28.750490273 +0000 UTC m=+2116.507565607" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.392521 4694 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-47s4j"] Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.394940 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.411171 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47s4j"] Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.464673 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-utilities\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.464775 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-catalog-content\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.464829 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psv9x\" (UniqueName: \"kubernetes.io/projected/de468612-cbe3-4083-99f6-a4eb8933c5bd-kube-api-access-psv9x\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.566362 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-utilities\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") 
" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.566478 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-catalog-content\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.566540 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psv9x\" (UniqueName: \"kubernetes.io/projected/de468612-cbe3-4083-99f6-a4eb8933c5bd-kube-api-access-psv9x\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.567479 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-utilities\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.567492 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-catalog-content\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.588140 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psv9x\" (UniqueName: \"kubernetes.io/projected/de468612-cbe3-4083-99f6-a4eb8933c5bd-kube-api-access-psv9x\") pod \"community-operators-47s4j\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " 
pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:36 crc kubenswrapper[4694]: I0217 17:17:36.769024 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:37 crc kubenswrapper[4694]: I0217 17:17:37.340897 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47s4j"] Feb 17 17:17:37 crc kubenswrapper[4694]: W0217 17:17:37.357907 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde468612_cbe3_4083_99f6_a4eb8933c5bd.slice/crio-9be5c7d349e0a0317f1590af73423c7425aa65052f3ec842c78d6b8007aa4f3f WatchSource:0}: Error finding container 9be5c7d349e0a0317f1590af73423c7425aa65052f3ec842c78d6b8007aa4f3f: Status 404 returned error can't find the container with id 9be5c7d349e0a0317f1590af73423c7425aa65052f3ec842c78d6b8007aa4f3f Feb 17 17:17:37 crc kubenswrapper[4694]: I0217 17:17:37.799458 4694 generic.go:334] "Generic (PLEG): container finished" podID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerID="50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034" exitCode=0 Feb 17 17:17:37 crc kubenswrapper[4694]: I0217 17:17:37.799563 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47s4j" event={"ID":"de468612-cbe3-4083-99f6-a4eb8933c5bd","Type":"ContainerDied","Data":"50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034"} Feb 17 17:17:37 crc kubenswrapper[4694]: I0217 17:17:37.799726 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47s4j" event={"ID":"de468612-cbe3-4083-99f6-a4eb8933c5bd","Type":"ContainerStarted","Data":"9be5c7d349e0a0317f1590af73423c7425aa65052f3ec842c78d6b8007aa4f3f"} Feb 17 17:17:39 crc kubenswrapper[4694]: I0217 17:17:39.821629 4694 generic.go:334] "Generic (PLEG): container finished" 
podID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerID="fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8" exitCode=0 Feb 17 17:17:39 crc kubenswrapper[4694]: I0217 17:17:39.821673 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47s4j" event={"ID":"de468612-cbe3-4083-99f6-a4eb8933c5bd","Type":"ContainerDied","Data":"fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8"} Feb 17 17:17:40 crc kubenswrapper[4694]: I0217 17:17:40.841565 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47s4j" event={"ID":"de468612-cbe3-4083-99f6-a4eb8933c5bd","Type":"ContainerStarted","Data":"49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e"} Feb 17 17:17:40 crc kubenswrapper[4694]: I0217 17:17:40.865354 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47s4j" podStartSLOduration=2.399578358 podStartE2EDuration="4.865335648s" podCreationTimestamp="2026-02-17 17:17:36 +0000 UTC" firstStartedPulling="2026-02-17 17:17:37.802419353 +0000 UTC m=+2125.559494687" lastFinishedPulling="2026-02-17 17:17:40.268176643 +0000 UTC m=+2128.025251977" observedRunningTime="2026-02-17 17:17:40.861121953 +0000 UTC m=+2128.618197287" watchObservedRunningTime="2026-02-17 17:17:40.865335648 +0000 UTC m=+2128.622410972" Feb 17 17:17:46 crc kubenswrapper[4694]: I0217 17:17:46.769361 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:46 crc kubenswrapper[4694]: I0217 17:17:46.769933 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:46 crc kubenswrapper[4694]: I0217 17:17:46.821939 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47s4j" Feb 17 
17:17:46 crc kubenswrapper[4694]: I0217 17:17:46.931483 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:47 crc kubenswrapper[4694]: I0217 17:17:47.061538 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47s4j"] Feb 17 17:17:48 crc kubenswrapper[4694]: I0217 17:17:48.903239 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47s4j" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="registry-server" containerID="cri-o://49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e" gracePeriod=2 Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.366062 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.504515 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-utilities\") pod \"de468612-cbe3-4083-99f6-a4eb8933c5bd\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.504570 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-catalog-content\") pod \"de468612-cbe3-4083-99f6-a4eb8933c5bd\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.504755 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psv9x\" (UniqueName: \"kubernetes.io/projected/de468612-cbe3-4083-99f6-a4eb8933c5bd-kube-api-access-psv9x\") pod \"de468612-cbe3-4083-99f6-a4eb8933c5bd\" (UID: \"de468612-cbe3-4083-99f6-a4eb8933c5bd\") " Feb 
17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.505956 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-utilities" (OuterVolumeSpecName: "utilities") pod "de468612-cbe3-4083-99f6-a4eb8933c5bd" (UID: "de468612-cbe3-4083-99f6-a4eb8933c5bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.510851 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de468612-cbe3-4083-99f6-a4eb8933c5bd-kube-api-access-psv9x" (OuterVolumeSpecName: "kube-api-access-psv9x") pod "de468612-cbe3-4083-99f6-a4eb8933c5bd" (UID: "de468612-cbe3-4083-99f6-a4eb8933c5bd"). InnerVolumeSpecName "kube-api-access-psv9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.571537 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de468612-cbe3-4083-99f6-a4eb8933c5bd" (UID: "de468612-cbe3-4083-99f6-a4eb8933c5bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.606716 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psv9x\" (UniqueName: \"kubernetes.io/projected/de468612-cbe3-4083-99f6-a4eb8933c5bd-kube-api-access-psv9x\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.606759 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.606772 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de468612-cbe3-4083-99f6-a4eb8933c5bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.912133 4694 generic.go:334] "Generic (PLEG): container finished" podID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerID="49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e" exitCode=0 Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.912188 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47s4j" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.912179 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47s4j" event={"ID":"de468612-cbe3-4083-99f6-a4eb8933c5bd","Type":"ContainerDied","Data":"49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e"} Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.912485 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47s4j" event={"ID":"de468612-cbe3-4083-99f6-a4eb8933c5bd","Type":"ContainerDied","Data":"9be5c7d349e0a0317f1590af73423c7425aa65052f3ec842c78d6b8007aa4f3f"} Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.912526 4694 scope.go:117] "RemoveContainer" containerID="49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.952768 4694 scope.go:117] "RemoveContainer" containerID="fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8" Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.958673 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47s4j"] Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.989729 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47s4j"] Feb 17 17:17:49 crc kubenswrapper[4694]: I0217 17:17:49.989862 4694 scope.go:117] "RemoveContainer" containerID="50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.017115 4694 scope.go:117] "RemoveContainer" containerID="49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e" Feb 17 17:17:50 crc kubenswrapper[4694]: E0217 17:17:50.018162 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e\": container with ID starting with 49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e not found: ID does not exist" containerID="49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.018200 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e"} err="failed to get container status \"49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e\": rpc error: code = NotFound desc = could not find container \"49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e\": container with ID starting with 49b50423711e01be1fa3dc910fd4f0ef6dc77bd865c11b2fc01ffea15b61413e not found: ID does not exist" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.018227 4694 scope.go:117] "RemoveContainer" containerID="fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8" Feb 17 17:17:50 crc kubenswrapper[4694]: E0217 17:17:50.018712 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8\": container with ID starting with fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8 not found: ID does not exist" containerID="fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.018736 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8"} err="failed to get container status \"fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8\": rpc error: code = NotFound desc = could not find container \"fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8\": container with ID 
starting with fe0b0fae7b43c2aab6de3f655d5f85da2046c70c61788d6ab440d6eb55b1ecf8 not found: ID does not exist" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.018750 4694 scope.go:117] "RemoveContainer" containerID="50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034" Feb 17 17:17:50 crc kubenswrapper[4694]: E0217 17:17:50.022143 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034\": container with ID starting with 50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034 not found: ID does not exist" containerID="50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.022175 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034"} err="failed to get container status \"50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034\": rpc error: code = NotFound desc = could not find container \"50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034\": container with ID starting with 50cb70e4c470be1b1cc29b23d3da7ce590e51a6a5312c25d088a1edfe57b8034 not found: ID does not exist" Feb 17 17:17:50 crc kubenswrapper[4694]: I0217 17:17:50.907744 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" path="/var/lib/kubelet/pods/de468612-cbe3-4083-99f6-a4eb8933c5bd/volumes" Feb 17 17:18:12 crc kubenswrapper[4694]: I0217 17:18:12.092189 4694 generic.go:334] "Generic (PLEG): container finished" podID="55da2ff6-efab-4cee-acb9-b0a04edc8980" containerID="0aafcb4ca274d1429a1df9f9e002755a4fdd80cc823bb9079e0ae5ee4b3d056f" exitCode=0 Feb 17 17:18:12 crc kubenswrapper[4694]: I0217 17:18:12.092499 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" event={"ID":"55da2ff6-efab-4cee-acb9-b0a04edc8980","Type":"ContainerDied","Data":"0aafcb4ca274d1429a1df9f9e002755a4fdd80cc823bb9079e0ae5ee4b3d056f"} Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.531155 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.654372 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-inventory\") pod \"55da2ff6-efab-4cee-acb9-b0a04edc8980\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.654453 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-metadata-combined-ca-bundle\") pod \"55da2ff6-efab-4cee-acb9-b0a04edc8980\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.654498 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrtqq\" (UniqueName: \"kubernetes.io/projected/55da2ff6-efab-4cee-acb9-b0a04edc8980-kube-api-access-xrtqq\") pod \"55da2ff6-efab-4cee-acb9-b0a04edc8980\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.654682 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-ssh-key-openstack-edpm-ipam\") pod \"55da2ff6-efab-4cee-acb9-b0a04edc8980\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.654753 4694 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-nova-metadata-neutron-config-0\") pod \"55da2ff6-efab-4cee-acb9-b0a04edc8980\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.654781 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-ovn-metadata-agent-neutron-config-0\") pod \"55da2ff6-efab-4cee-acb9-b0a04edc8980\" (UID: \"55da2ff6-efab-4cee-acb9-b0a04edc8980\") " Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.662592 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55da2ff6-efab-4cee-acb9-b0a04edc8980-kube-api-access-xrtqq" (OuterVolumeSpecName: "kube-api-access-xrtqq") pod "55da2ff6-efab-4cee-acb9-b0a04edc8980" (UID: "55da2ff6-efab-4cee-acb9-b0a04edc8980"). InnerVolumeSpecName "kube-api-access-xrtqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.662764 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "55da2ff6-efab-4cee-acb9-b0a04edc8980" (UID: "55da2ff6-efab-4cee-acb9-b0a04edc8980"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.684298 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-inventory" (OuterVolumeSpecName: "inventory") pod "55da2ff6-efab-4cee-acb9-b0a04edc8980" (UID: "55da2ff6-efab-4cee-acb9-b0a04edc8980"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.684908 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "55da2ff6-efab-4cee-acb9-b0a04edc8980" (UID: "55da2ff6-efab-4cee-acb9-b0a04edc8980"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.685185 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "55da2ff6-efab-4cee-acb9-b0a04edc8980" (UID: "55da2ff6-efab-4cee-acb9-b0a04edc8980"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.698307 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55da2ff6-efab-4cee-acb9-b0a04edc8980" (UID: "55da2ff6-efab-4cee-acb9-b0a04edc8980"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.757551 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.757617 4694 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.757637 4694 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.757653 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.757665 4694 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55da2ff6-efab-4cee-acb9-b0a04edc8980-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:18:13 crc kubenswrapper[4694]: I0217 17:18:13.757678 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrtqq\" (UniqueName: \"kubernetes.io/projected/55da2ff6-efab-4cee-acb9-b0a04edc8980-kube-api-access-xrtqq\") on node \"crc\" DevicePath \"\"" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.115517 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" event={"ID":"55da2ff6-efab-4cee-acb9-b0a04edc8980","Type":"ContainerDied","Data":"c1939f741648549720b015dd01786761eca86288434966bddcc79ec5f9f0d50a"} Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.115587 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1939f741648549720b015dd01786761eca86288434966bddcc79ec5f9f0d50a" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.115672 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.205783 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh"] Feb 17 17:18:14 crc kubenswrapper[4694]: E0217 17:18:14.206278 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="extract-content" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.206301 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="extract-content" Feb 17 17:18:14 crc kubenswrapper[4694]: E0217 17:18:14.206317 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55da2ff6-efab-4cee-acb9-b0a04edc8980" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.206327 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="55da2ff6-efab-4cee-acb9-b0a04edc8980" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 17:18:14 crc kubenswrapper[4694]: E0217 17:18:14.206346 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="registry-server" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.206351 4694 
state_mem.go:107] "Deleted CPUSet assignment" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="registry-server" Feb 17 17:18:14 crc kubenswrapper[4694]: E0217 17:18:14.206365 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="extract-utilities" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.206371 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="extract-utilities" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.206549 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="de468612-cbe3-4083-99f6-a4eb8933c5bd" containerName="registry-server" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.206563 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="55da2ff6-efab-4cee-acb9-b0a04edc8980" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.207243 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.211621 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.211851 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.212066 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.213025 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.213883 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh"] Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.224094 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.368426 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.368511 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.368561 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.368580 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.368959 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjmxt\" (UniqueName: \"kubernetes.io/projected/032fed18-d394-4743-ac9d-efa8d472bbc2-kube-api-access-rjmxt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.470982 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.471135 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.471177 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.471359 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjmxt\" (UniqueName: \"kubernetes.io/projected/032fed18-d394-4743-ac9d-efa8d472bbc2-kube-api-access-rjmxt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.471415 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.475062 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: 
\"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.475334 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.475563 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.475791 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.489013 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjmxt\" (UniqueName: \"kubernetes.io/projected/032fed18-d394-4743-ac9d-efa8d472bbc2-kube-api-access-rjmxt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:14 crc kubenswrapper[4694]: I0217 17:18:14.535589 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:18:15 crc kubenswrapper[4694]: I0217 17:18:15.095248 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh"] Feb 17 17:18:15 crc kubenswrapper[4694]: W0217 17:18:15.097622 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032fed18_d394_4743_ac9d_efa8d472bbc2.slice/crio-2943fa32b796f0a9e37283bf2028054af3169fb9393e7b3ffd3fa97b4711df68 WatchSource:0}: Error finding container 2943fa32b796f0a9e37283bf2028054af3169fb9393e7b3ffd3fa97b4711df68: Status 404 returned error can't find the container with id 2943fa32b796f0a9e37283bf2028054af3169fb9393e7b3ffd3fa97b4711df68 Feb 17 17:18:15 crc kubenswrapper[4694]: I0217 17:18:15.125368 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" event={"ID":"032fed18-d394-4743-ac9d-efa8d472bbc2","Type":"ContainerStarted","Data":"2943fa32b796f0a9e37283bf2028054af3169fb9393e7b3ffd3fa97b4711df68"} Feb 17 17:18:16 crc kubenswrapper[4694]: I0217 17:18:16.136792 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" event={"ID":"032fed18-d394-4743-ac9d-efa8d472bbc2","Type":"ContainerStarted","Data":"a89284264d364cc70967f6946eb287babf4b155f578491c74bafc98a8419481a"} Feb 17 17:18:16 crc kubenswrapper[4694]: I0217 17:18:16.179044 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" podStartSLOduration=1.962119927 podStartE2EDuration="2.179017602s" podCreationTimestamp="2026-02-17 17:18:14 +0000 UTC" firstStartedPulling="2026-02-17 17:18:15.100244586 +0000 UTC m=+2162.857319910" lastFinishedPulling="2026-02-17 17:18:15.317142261 +0000 UTC m=+2163.074217585" 
observedRunningTime="2026-02-17 17:18:16.161215829 +0000 UTC m=+2163.918291163" watchObservedRunningTime="2026-02-17 17:18:16.179017602 +0000 UTC m=+2163.936092966" Feb 17 17:18:44 crc kubenswrapper[4694]: I0217 17:18:44.618515 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:18:44 crc kubenswrapper[4694]: I0217 17:18:44.619189 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.230618 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d8mmx"] Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.233363 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.241529 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8mmx"] Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.367629 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-catalog-content\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.367724 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-utilities\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.367849 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh66h\" (UniqueName: \"kubernetes.io/projected/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-kube-api-access-xh66h\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.469746 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-utilities\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.469839 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xh66h\" (UniqueName: \"kubernetes.io/projected/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-kube-api-access-xh66h\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.469926 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-catalog-content\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.470449 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-utilities\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.470464 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-catalog-content\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.491139 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh66h\" (UniqueName: \"kubernetes.io/projected/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-kube-api-access-xh66h\") pod \"certified-operators-d8mmx\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:51 crc kubenswrapper[4694]: I0217 17:18:51.570870 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:18:52 crc kubenswrapper[4694]: I0217 17:18:52.034357 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8mmx"] Feb 17 17:18:52 crc kubenswrapper[4694]: I0217 17:18:52.450698 4694 generic.go:334] "Generic (PLEG): container finished" podID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerID="6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84" exitCode=0 Feb 17 17:18:52 crc kubenswrapper[4694]: I0217 17:18:52.451008 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerDied","Data":"6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84"} Feb 17 17:18:52 crc kubenswrapper[4694]: I0217 17:18:52.451037 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerStarted","Data":"5fee7fe8858e476a209e55ab508c0c358449d8ca8f5a2aa5b5876d875a42cb3f"} Feb 17 17:18:53 crc kubenswrapper[4694]: I0217 17:18:53.463402 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerStarted","Data":"617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73"} Feb 17 17:18:54 crc kubenswrapper[4694]: I0217 17:18:54.493364 4694 generic.go:334] "Generic (PLEG): container finished" podID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerID="617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73" exitCode=0 Feb 17 17:18:54 crc kubenswrapper[4694]: I0217 17:18:54.493408 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" 
event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerDied","Data":"617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73"} Feb 17 17:18:55 crc kubenswrapper[4694]: I0217 17:18:55.502782 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerStarted","Data":"736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9"} Feb 17 17:18:55 crc kubenswrapper[4694]: I0217 17:18:55.520288 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d8mmx" podStartSLOduration=2.077702933 podStartE2EDuration="4.520269906s" podCreationTimestamp="2026-02-17 17:18:51 +0000 UTC" firstStartedPulling="2026-02-17 17:18:52.452862039 +0000 UTC m=+2200.209937363" lastFinishedPulling="2026-02-17 17:18:54.895429012 +0000 UTC m=+2202.652504336" observedRunningTime="2026-02-17 17:18:55.517025725 +0000 UTC m=+2203.274101059" watchObservedRunningTime="2026-02-17 17:18:55.520269906 +0000 UTC m=+2203.277345230" Feb 17 17:19:01 crc kubenswrapper[4694]: I0217 17:19:01.571499 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:19:01 crc kubenswrapper[4694]: I0217 17:19:01.572209 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:19:01 crc kubenswrapper[4694]: I0217 17:19:01.615782 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:19:02 crc kubenswrapper[4694]: I0217 17:19:02.620039 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:19:02 crc kubenswrapper[4694]: I0217 17:19:02.674984 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-d8mmx"] Feb 17 17:19:04 crc kubenswrapper[4694]: I0217 17:19:04.591189 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d8mmx" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="registry-server" containerID="cri-o://736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9" gracePeriod=2 Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.035574 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.132242 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-utilities\") pod \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.132592 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh66h\" (UniqueName: \"kubernetes.io/projected/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-kube-api-access-xh66h\") pod \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.132835 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-catalog-content\") pod \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\" (UID: \"2d61b8c7-c935-4c72-b86b-8e6b21db25e1\") " Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.133376 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-utilities" (OuterVolumeSpecName: "utilities") pod "2d61b8c7-c935-4c72-b86b-8e6b21db25e1" (UID: 
"2d61b8c7-c935-4c72-b86b-8e6b21db25e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.133559 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.138159 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-kube-api-access-xh66h" (OuterVolumeSpecName: "kube-api-access-xh66h") pod "2d61b8c7-c935-4c72-b86b-8e6b21db25e1" (UID: "2d61b8c7-c935-4c72-b86b-8e6b21db25e1"). InnerVolumeSpecName "kube-api-access-xh66h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.179385 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d61b8c7-c935-4c72-b86b-8e6b21db25e1" (UID: "2d61b8c7-c935-4c72-b86b-8e6b21db25e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.234818 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh66h\" (UniqueName: \"kubernetes.io/projected/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-kube-api-access-xh66h\") on node \"crc\" DevicePath \"\"" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.234856 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d61b8c7-c935-4c72-b86b-8e6b21db25e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.603250 4694 generic.go:334] "Generic (PLEG): container finished" podID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerID="736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9" exitCode=0 Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.603328 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8mmx" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.603296 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerDied","Data":"736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9"} Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.603965 4694 scope.go:117] "RemoveContainer" containerID="736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.604307 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8mmx" event={"ID":"2d61b8c7-c935-4c72-b86b-8e6b21db25e1","Type":"ContainerDied","Data":"5fee7fe8858e476a209e55ab508c0c358449d8ca8f5a2aa5b5876d875a42cb3f"} Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.622070 4694 scope.go:117] "RemoveContainer" 
containerID="617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.648702 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8mmx"] Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.652926 4694 scope.go:117] "RemoveContainer" containerID="6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.658417 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d8mmx"] Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.687385 4694 scope.go:117] "RemoveContainer" containerID="736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9" Feb 17 17:19:05 crc kubenswrapper[4694]: E0217 17:19:05.688023 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9\": container with ID starting with 736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9 not found: ID does not exist" containerID="736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.688069 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9"} err="failed to get container status \"736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9\": rpc error: code = NotFound desc = could not find container \"736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9\": container with ID starting with 736bd96d7cbb4a9d25f42abb29a65b0710083a0ce5ed12940071bc6d0c86b6c9 not found: ID does not exist" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.688095 4694 scope.go:117] "RemoveContainer" 
containerID="617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73" Feb 17 17:19:05 crc kubenswrapper[4694]: E0217 17:19:05.688407 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73\": container with ID starting with 617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73 not found: ID does not exist" containerID="617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.688443 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73"} err="failed to get container status \"617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73\": rpc error: code = NotFound desc = could not find container \"617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73\": container with ID starting with 617cc3a53b1a896a6ec622fc2d9b9e35aa9192addde57a7a61f78f322fd8ee73 not found: ID does not exist" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.688468 4694 scope.go:117] "RemoveContainer" containerID="6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84" Feb 17 17:19:05 crc kubenswrapper[4694]: E0217 17:19:05.688807 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84\": container with ID starting with 6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84 not found: ID does not exist" containerID="6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84" Feb 17 17:19:05 crc kubenswrapper[4694]: I0217 17:19:05.688827 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84"} err="failed to get container status \"6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84\": rpc error: code = NotFound desc = could not find container \"6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84\": container with ID starting with 6b092c092b87e09cc1a20b3f91adfb4f3c7cdc917674f32f9162e177e7f47a84 not found: ID does not exist" Feb 17 17:19:06 crc kubenswrapper[4694]: I0217 17:19:06.912852 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" path="/var/lib/kubelet/pods/2d61b8c7-c935-4c72-b86b-8e6b21db25e1/volumes" Feb 17 17:19:14 crc kubenswrapper[4694]: I0217 17:19:14.617870 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:19:14 crc kubenswrapper[4694]: I0217 17:19:14.618445 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.618398 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.619814 4694 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.619869 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.620626 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.620684 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" gracePeriod=600 Feb 17 17:19:44 crc kubenswrapper[4694]: E0217 17:19:44.745120 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.955025 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" 
containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" exitCode=0 Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.955076 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f"} Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.955107 4694 scope.go:117] "RemoveContainer" containerID="ec123f0ed5af2dd76240e50f17f9912ad66145b52bcd42fc0facf04b5463eb02" Feb 17 17:19:44 crc kubenswrapper[4694]: I0217 17:19:44.955568 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:19:44 crc kubenswrapper[4694]: E0217 17:19:44.955906 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:19:59 crc kubenswrapper[4694]: I0217 17:19:59.895809 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:19:59 crc kubenswrapper[4694]: E0217 17:19:59.896543 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:20:10 crc kubenswrapper[4694]: I0217 
17:20:10.896726 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:20:10 crc kubenswrapper[4694]: E0217 17:20:10.897529 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:20:24 crc kubenswrapper[4694]: I0217 17:20:24.896008 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:20:24 crc kubenswrapper[4694]: E0217 17:20:24.896788 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:20:39 crc kubenswrapper[4694]: I0217 17:20:39.895157 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:20:39 crc kubenswrapper[4694]: E0217 17:20:39.895872 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:20:54 crc 
kubenswrapper[4694]: I0217 17:20:54.896018 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:20:54 crc kubenswrapper[4694]: E0217 17:20:54.897015 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:21:06 crc kubenswrapper[4694]: I0217 17:21:06.895412 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:21:06 crc kubenswrapper[4694]: E0217 17:21:06.896196 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:21:17 crc kubenswrapper[4694]: I0217 17:21:17.896011 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:21:17 crc kubenswrapper[4694]: E0217 17:21:17.896688 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 
17 17:21:32 crc kubenswrapper[4694]: I0217 17:21:32.900673 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:21:32 crc kubenswrapper[4694]: E0217 17:21:32.901551 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:21:41 crc kubenswrapper[4694]: I0217 17:21:41.956926 4694 generic.go:334] "Generic (PLEG): container finished" podID="032fed18-d394-4743-ac9d-efa8d472bbc2" containerID="a89284264d364cc70967f6946eb287babf4b155f578491c74bafc98a8419481a" exitCode=0 Feb 17 17:21:41 crc kubenswrapper[4694]: I0217 17:21:41.957012 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" event={"ID":"032fed18-d394-4743-ac9d-efa8d472bbc2","Type":"ContainerDied","Data":"a89284264d364cc70967f6946eb287babf4b155f578491c74bafc98a8419481a"} Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.392404 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.515110 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-ssh-key-openstack-edpm-ipam\") pod \"032fed18-d394-4743-ac9d-efa8d472bbc2\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.515196 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-secret-0\") pod \"032fed18-d394-4743-ac9d-efa8d472bbc2\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.515314 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-combined-ca-bundle\") pod \"032fed18-d394-4743-ac9d-efa8d472bbc2\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.515356 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjmxt\" (UniqueName: \"kubernetes.io/projected/032fed18-d394-4743-ac9d-efa8d472bbc2-kube-api-access-rjmxt\") pod \"032fed18-d394-4743-ac9d-efa8d472bbc2\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.515401 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-inventory\") pod \"032fed18-d394-4743-ac9d-efa8d472bbc2\" (UID: \"032fed18-d394-4743-ac9d-efa8d472bbc2\") " Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.522174 4694 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "032fed18-d394-4743-ac9d-efa8d472bbc2" (UID: "032fed18-d394-4743-ac9d-efa8d472bbc2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.527398 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032fed18-d394-4743-ac9d-efa8d472bbc2-kube-api-access-rjmxt" (OuterVolumeSpecName: "kube-api-access-rjmxt") pod "032fed18-d394-4743-ac9d-efa8d472bbc2" (UID: "032fed18-d394-4743-ac9d-efa8d472bbc2"). InnerVolumeSpecName "kube-api-access-rjmxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.545420 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "032fed18-d394-4743-ac9d-efa8d472bbc2" (UID: "032fed18-d394-4743-ac9d-efa8d472bbc2"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.546853 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "032fed18-d394-4743-ac9d-efa8d472bbc2" (UID: "032fed18-d394-4743-ac9d-efa8d472bbc2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.558088 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-inventory" (OuterVolumeSpecName: "inventory") pod "032fed18-d394-4743-ac9d-efa8d472bbc2" (UID: "032fed18-d394-4743-ac9d-efa8d472bbc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.618305 4694 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.618338 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjmxt\" (UniqueName: \"kubernetes.io/projected/032fed18-d394-4743-ac9d-efa8d472bbc2-kube-api-access-rjmxt\") on node \"crc\" DevicePath \"\"" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.618352 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.618365 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.618377 4694 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/032fed18-d394-4743-ac9d-efa8d472bbc2-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.895364 4694 scope.go:117] "RemoveContainer" 
containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:21:43 crc kubenswrapper[4694]: E0217 17:21:43.895787 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.978134 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" event={"ID":"032fed18-d394-4743-ac9d-efa8d472bbc2","Type":"ContainerDied","Data":"2943fa32b796f0a9e37283bf2028054af3169fb9393e7b3ffd3fa97b4711df68"} Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.978190 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2943fa32b796f0a9e37283bf2028054af3169fb9393e7b3ffd3fa97b4711df68" Feb 17 17:21:43 crc kubenswrapper[4694]: I0217 17:21:43.978272 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.081355 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg"] Feb 17 17:21:44 crc kubenswrapper[4694]: E0217 17:21:44.081959 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="registry-server" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.081983 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="registry-server" Feb 17 17:21:44 crc kubenswrapper[4694]: E0217 17:21:44.081994 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="extract-content" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.082004 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="extract-content" Feb 17 17:21:44 crc kubenswrapper[4694]: E0217 17:21:44.082029 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032fed18-d394-4743-ac9d-efa8d472bbc2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.082038 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="032fed18-d394-4743-ac9d-efa8d472bbc2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 17:21:44 crc kubenswrapper[4694]: E0217 17:21:44.082045 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="extract-utilities" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.082052 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="extract-utilities" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.082352 4694 
memory_manager.go:354] "RemoveStaleState removing state" podUID="032fed18-d394-4743-ac9d-efa8d472bbc2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.082378 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d61b8c7-c935-4c72-b86b-8e6b21db25e1" containerName="registry-server" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.083270 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.085818 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.087160 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.087626 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.087880 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.087827 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.088015 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.088407 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.092320 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg"] Feb 17 17:21:44 crc 
kubenswrapper[4694]: I0217 17:21:44.234651 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.234755 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f97bc145-9375-4a33-8b64-699355feb0fd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.235274 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.235422 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.235517 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" 
(UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.235643 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.235785 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmb89\" (UniqueName: \"kubernetes.io/projected/f97bc145-9375-4a33-8b64-699355feb0fd-kube-api-access-tmb89\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.235920 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.236046 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" 
(UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.337959 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.338139 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f97bc145-9375-4a33-8b64-699355feb0fd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.338172 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.338207 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.338238 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.338498 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.338527 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmb89\" (UniqueName: \"kubernetes.io/projected/f97bc145-9375-4a33-8b64-699355feb0fd-kube-api-access-tmb89\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.339097 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.339135 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: 
\"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.341255 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f97bc145-9375-4a33-8b64-699355feb0fd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.342693 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.342822 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.343820 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.344792 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.346058 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.352871 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.354496 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.355871 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmb89\" (UniqueName: \"kubernetes.io/projected/f97bc145-9375-4a33-8b64-699355feb0fd-kube-api-access-tmb89\") pod \"nova-edpm-deployment-openstack-edpm-ipam-ddccg\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 
17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.409281 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.967189 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg"] Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.968919 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:21:44 crc kubenswrapper[4694]: I0217 17:21:44.994560 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" event={"ID":"f97bc145-9375-4a33-8b64-699355feb0fd","Type":"ContainerStarted","Data":"2a5945e3d9501f1c081c8fc39987d953a91a7169471396df466aa05c68afcfc5"} Feb 17 17:21:46 crc kubenswrapper[4694]: I0217 17:21:46.005524 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" event={"ID":"f97bc145-9375-4a33-8b64-699355feb0fd","Type":"ContainerStarted","Data":"a9749d98ef1126c16599b656b4d0ad3e0bc376cda03ffdc7c3237e1c0deeb49a"} Feb 17 17:21:46 crc kubenswrapper[4694]: I0217 17:21:46.032928 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" podStartSLOduration=1.8301466039999998 podStartE2EDuration="2.032906329s" podCreationTimestamp="2026-02-17 17:21:44 +0000 UTC" firstStartedPulling="2026-02-17 17:21:44.968731865 +0000 UTC m=+2372.725807189" lastFinishedPulling="2026-02-17 17:21:45.17149159 +0000 UTC m=+2372.928566914" observedRunningTime="2026-02-17 17:21:46.025953996 +0000 UTC m=+2373.783029320" watchObservedRunningTime="2026-02-17 17:21:46.032906329 +0000 UTC m=+2373.789981663" Feb 17 17:21:57 crc kubenswrapper[4694]: I0217 17:21:57.895448 4694 scope.go:117] "RemoveContainer" 
containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:21:57 crc kubenswrapper[4694]: E0217 17:21:57.896283 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:22:12 crc kubenswrapper[4694]: I0217 17:22:12.900501 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:22:12 crc kubenswrapper[4694]: E0217 17:22:12.901193 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:22:26 crc kubenswrapper[4694]: I0217 17:22:26.896096 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:22:26 crc kubenswrapper[4694]: E0217 17:22:26.896993 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:22:41 crc kubenswrapper[4694]: I0217 17:22:41.895918 4694 scope.go:117] 
"RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:22:41 crc kubenswrapper[4694]: E0217 17:22:41.897827 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:22:55 crc kubenswrapper[4694]: I0217 17:22:55.895291 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:22:55 crc kubenswrapper[4694]: E0217 17:22:55.896648 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:23:08 crc kubenswrapper[4694]: I0217 17:23:08.895553 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:23:08 crc kubenswrapper[4694]: E0217 17:23:08.896469 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:23:23 crc kubenswrapper[4694]: I0217 17:23:23.895399 
4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:23:23 crc kubenswrapper[4694]: E0217 17:23:23.896090 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:23:38 crc kubenswrapper[4694]: I0217 17:23:38.896439 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:23:38 crc kubenswrapper[4694]: E0217 17:23:38.897518 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:23:49 crc kubenswrapper[4694]: I0217 17:23:49.895782 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:23:49 crc kubenswrapper[4694]: E0217 17:23:49.897191 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:23:53 crc kubenswrapper[4694]: I0217 
17:23:53.063668 4694 generic.go:334] "Generic (PLEG): container finished" podID="f97bc145-9375-4a33-8b64-699355feb0fd" containerID="a9749d98ef1126c16599b656b4d0ad3e0bc376cda03ffdc7c3237e1c0deeb49a" exitCode=0 Feb 17 17:23:53 crc kubenswrapper[4694]: I0217 17:23:53.063927 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" event={"ID":"f97bc145-9375-4a33-8b64-699355feb0fd","Type":"ContainerDied","Data":"a9749d98ef1126c16599b656b4d0ad3e0bc376cda03ffdc7c3237e1c0deeb49a"} Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.503769 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.636180 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-combined-ca-bundle\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.636346 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-0\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.636409 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-1\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.636445 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-0\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.637120 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-ssh-key-openstack-edpm-ipam\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.637354 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-inventory\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.637420 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f97bc145-9375-4a33-8b64-699355feb0fd-nova-extra-config-0\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.637473 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmb89\" (UniqueName: \"kubernetes.io/projected/f97bc145-9375-4a33-8b64-699355feb0fd-kube-api-access-tmb89\") pod \"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.637493 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-1\") pod 
\"f97bc145-9375-4a33-8b64-699355feb0fd\" (UID: \"f97bc145-9375-4a33-8b64-699355feb0fd\") " Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.642302 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f97bc145-9375-4a33-8b64-699355feb0fd-kube-api-access-tmb89" (OuterVolumeSpecName: "kube-api-access-tmb89") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "kube-api-access-tmb89". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.659624 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.668909 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f97bc145-9375-4a33-8b64-699355feb0fd-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.673993 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.674636 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.677878 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.677933 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.680847 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.689557 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-inventory" (OuterVolumeSpecName: "inventory") pod "f97bc145-9375-4a33-8b64-699355feb0fd" (UID: "f97bc145-9375-4a33-8b64-699355feb0fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739171 4694 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739210 4694 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739223 4694 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739235 4694 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739246 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739258 4694 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739268 4694 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f97bc145-9375-4a33-8b64-699355feb0fd-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739280 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmb89\" (UniqueName: \"kubernetes.io/projected/f97bc145-9375-4a33-8b64-699355feb0fd-kube-api-access-tmb89\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:54 crc kubenswrapper[4694]: I0217 17:23:54.739292 4694 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f97bc145-9375-4a33-8b64-699355feb0fd-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.081350 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" event={"ID":"f97bc145-9375-4a33-8b64-699355feb0fd","Type":"ContainerDied","Data":"2a5945e3d9501f1c081c8fc39987d953a91a7169471396df466aa05c68afcfc5"} Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.081409 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5945e3d9501f1c081c8fc39987d953a91a7169471396df466aa05c68afcfc5" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.081415 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-ddccg" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.180068 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj"] Feb 17 17:23:55 crc kubenswrapper[4694]: E0217 17:23:55.180497 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f97bc145-9375-4a33-8b64-699355feb0fd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.180516 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="f97bc145-9375-4a33-8b64-699355feb0fd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.180762 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="f97bc145-9375-4a33-8b64-699355feb0fd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.181445 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.187280 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.187311 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.187443 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.187714 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.187770 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w7f6d" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.193075 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj"] Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.349816 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.349895 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: 
\"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.349925 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv7qt\" (UniqueName: \"kubernetes.io/projected/fe57a3c1-260f-4f46-b977-82656c0ad9d6-kube-api-access-tv7qt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.349954 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.349974 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.349995 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.350027 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.451887 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.452214 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv7qt\" (UniqueName: \"kubernetes.io/projected/fe57a3c1-260f-4f46-b977-82656c0ad9d6-kube-api-access-tv7qt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.452368 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.452483 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.452574 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.452703 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.452960 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.456074 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.456326 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.456368 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.457155 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.458490 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: 
\"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.468524 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.472969 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv7qt\" (UniqueName: \"kubernetes.io/projected/fe57a3c1-260f-4f46-b977-82656c0ad9d6-kube-api-access-tv7qt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:55 crc kubenswrapper[4694]: I0217 17:23:55.508368 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:23:56 crc kubenswrapper[4694]: I0217 17:23:56.016930 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj"] Feb 17 17:23:56 crc kubenswrapper[4694]: I0217 17:23:56.089004 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" event={"ID":"fe57a3c1-260f-4f46-b977-82656c0ad9d6","Type":"ContainerStarted","Data":"6a1e04cbbd8c013bef7abb11c6eb92ac392375a6f5446947264afdd13af43f3c"} Feb 17 17:23:57 crc kubenswrapper[4694]: I0217 17:23:57.117413 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" event={"ID":"fe57a3c1-260f-4f46-b977-82656c0ad9d6","Type":"ContainerStarted","Data":"045f179ffda588615c2744288dc9858bb573961304a1c87cd833492b0e909ee9"} Feb 17 17:23:57 crc kubenswrapper[4694]: I0217 17:23:57.139550 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" podStartSLOduration=1.941781084 podStartE2EDuration="2.139531983s" podCreationTimestamp="2026-02-17 17:23:55 +0000 UTC" firstStartedPulling="2026-02-17 17:23:56.030449452 +0000 UTC m=+2503.787524786" lastFinishedPulling="2026-02-17 17:23:56.228200351 +0000 UTC m=+2503.985275685" observedRunningTime="2026-02-17 17:23:57.131413991 +0000 UTC m=+2504.888489315" watchObservedRunningTime="2026-02-17 17:23:57.139531983 +0000 UTC m=+2504.896607307" Feb 17 17:24:01 crc kubenswrapper[4694]: I0217 17:24:01.896225 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:24:01 crc kubenswrapper[4694]: E0217 17:24:01.897105 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:24:12 crc kubenswrapper[4694]: I0217 17:24:12.902638 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:24:12 crc kubenswrapper[4694]: E0217 17:24:12.903822 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:24:25 crc kubenswrapper[4694]: I0217 17:24:25.895371 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:24:25 crc kubenswrapper[4694]: E0217 17:24:25.896316 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:24:36 crc kubenswrapper[4694]: I0217 17:24:36.896365 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:24:36 crc kubenswrapper[4694]: E0217 17:24:36.897222 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:24:51 crc kubenswrapper[4694]: I0217 17:24:51.896082 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:24:52 crc kubenswrapper[4694]: I0217 17:24:52.644760 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"7e4c65ae75daf038be895421e845d23b212b799247ca1ff722bd87391856051d"} Feb 17 17:26:11 crc kubenswrapper[4694]: I0217 17:26:11.621906 4694 generic.go:334] "Generic (PLEG): container finished" podID="fe57a3c1-260f-4f46-b977-82656c0ad9d6" containerID="045f179ffda588615c2744288dc9858bb573961304a1c87cd833492b0e909ee9" exitCode=0 Feb 17 17:26:11 crc kubenswrapper[4694]: I0217 17:26:11.622056 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" event={"ID":"fe57a3c1-260f-4f46-b977-82656c0ad9d6","Type":"ContainerDied","Data":"045f179ffda588615c2744288dc9858bb573961304a1c87cd833492b0e909ee9"} Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.077283 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.128874 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-0\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.128942 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-telemetry-combined-ca-bundle\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.128993 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-inventory\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.129034 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv7qt\" (UniqueName: \"kubernetes.io/projected/fe57a3c1-260f-4f46-b977-82656c0ad9d6-kube-api-access-tv7qt\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.129098 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-2\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: 
I0217 17:26:13.129140 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ssh-key-openstack-edpm-ipam\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.129159 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-1\") pod \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\" (UID: \"fe57a3c1-260f-4f46-b977-82656c0ad9d6\") " Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.143930 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe57a3c1-260f-4f46-b977-82656c0ad9d6-kube-api-access-tv7qt" (OuterVolumeSpecName: "kube-api-access-tv7qt") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "kube-api-access-tv7qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.187202 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.199264 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.202894 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.240588 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.240859 4694 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.240957 4694 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.241061 4694 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-tv7qt\" (UniqueName: \"kubernetes.io/projected/fe57a3c1-260f-4f46-b977-82656c0ad9d6-kube-api-access-tv7qt\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.259723 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-inventory" (OuterVolumeSpecName: "inventory") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.290159 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.291799 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "fe57a3c1-260f-4f46-b977-82656c0ad9d6" (UID: "fe57a3c1-260f-4f46-b977-82656c0ad9d6"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.341828 4694 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.341861 4694 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.341872 4694 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe57a3c1-260f-4f46-b977-82656c0ad9d6-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.639099 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj" event={"ID":"fe57a3c1-260f-4f46-b977-82656c0ad9d6","Type":"ContainerDied","Data":"6a1e04cbbd8c013bef7abb11c6eb92ac392375a6f5446947264afdd13af43f3c"} Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.639345 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1e04cbbd8c013bef7abb11c6eb92ac392375a6f5446947264afdd13af43f3c" Feb 17 17:26:13 crc kubenswrapper[4694]: I0217 17:26:13.639137 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.022496 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gh8r5"]
Feb 17 17:26:16 crc kubenswrapper[4694]: E0217 17:26:16.024123 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe57a3c1-260f-4f46-b977-82656c0ad9d6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.024173 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe57a3c1-260f-4f46-b977-82656c0ad9d6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.024537 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe57a3c1-260f-4f46-b977-82656c0ad9d6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.027814 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.057838 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh8r5"]
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.198947 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-utilities\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.199007 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-catalog-content\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.199427 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/7c41fa51-d324-4c66-9cc5-3d781c1163e2-kube-api-access-jwggf\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.300838 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-catalog-content\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.300994 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/7c41fa51-d324-4c66-9cc5-3d781c1163e2-kube-api-access-jwggf\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.301038 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-utilities\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.301395 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-catalog-content\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.301462 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-utilities\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.323325 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/7c41fa51-d324-4c66-9cc5-3d781c1163e2-kube-api-access-jwggf\") pod \"redhat-marketplace-gh8r5\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") " pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.346138 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:16 crc kubenswrapper[4694]: I0217 17:26:16.824385 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh8r5"]
Feb 17 17:26:17 crc kubenswrapper[4694]: I0217 17:26:17.680307 4694 generic.go:334] "Generic (PLEG): container finished" podID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerID="47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f" exitCode=0
Feb 17 17:26:17 crc kubenswrapper[4694]: I0217 17:26:17.680439 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh8r5" event={"ID":"7c41fa51-d324-4c66-9cc5-3d781c1163e2","Type":"ContainerDied","Data":"47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f"}
Feb 17 17:26:17 crc kubenswrapper[4694]: I0217 17:26:17.680673 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh8r5" event={"ID":"7c41fa51-d324-4c66-9cc5-3d781c1163e2","Type":"ContainerStarted","Data":"ba5fab3605c645eafdb94462d16712ff1576f1fed2d64991a259418e1913bbed"}
Feb 17 17:26:18 crc kubenswrapper[4694]: I0217 17:26:18.690057 4694 generic.go:334] "Generic (PLEG): container finished" podID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerID="e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499" exitCode=0
Feb 17 17:26:18 crc kubenswrapper[4694]: I0217 17:26:18.690271 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh8r5" event={"ID":"7c41fa51-d324-4c66-9cc5-3d781c1163e2","Type":"ContainerDied","Data":"e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499"}
Feb 17 17:26:19 crc kubenswrapper[4694]: I0217 17:26:19.703301 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh8r5" event={"ID":"7c41fa51-d324-4c66-9cc5-3d781c1163e2","Type":"ContainerStarted","Data":"5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8"}
Feb 17 17:26:19 crc kubenswrapper[4694]: I0217 17:26:19.726029 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gh8r5" podStartSLOduration=3.359781999 podStartE2EDuration="4.726010682s" podCreationTimestamp="2026-02-17 17:26:15 +0000 UTC" firstStartedPulling="2026-02-17 17:26:17.68287405 +0000 UTC m=+2645.439949404" lastFinishedPulling="2026-02-17 17:26:19.049102753 +0000 UTC m=+2646.806178087" observedRunningTime="2026-02-17 17:26:19.724387391 +0000 UTC m=+2647.481462755" watchObservedRunningTime="2026-02-17 17:26:19.726010682 +0000 UTC m=+2647.483086026"
Feb 17 17:26:26 crc kubenswrapper[4694]: I0217 17:26:26.346988 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:26 crc kubenswrapper[4694]: I0217 17:26:26.347765 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:26 crc kubenswrapper[4694]: I0217 17:26:26.412448 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:26 crc kubenswrapper[4694]: I0217 17:26:26.816106 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:26 crc kubenswrapper[4694]: I0217 17:26:26.876115 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh8r5"]
Feb 17 17:26:28 crc kubenswrapper[4694]: I0217 17:26:28.788418 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gh8r5" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="registry-server" containerID="cri-o://5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8" gracePeriod=2
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.230532 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.356382 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/7c41fa51-d324-4c66-9cc5-3d781c1163e2-kube-api-access-jwggf\") pod \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") "
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.356513 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-catalog-content\") pod \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") "
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.356658 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-utilities\") pod \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\" (UID: \"7c41fa51-d324-4c66-9cc5-3d781c1163e2\") "
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.357722 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-utilities" (OuterVolumeSpecName: "utilities") pod "7c41fa51-d324-4c66-9cc5-3d781c1163e2" (UID: "7c41fa51-d324-4c66-9cc5-3d781c1163e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.361888 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c41fa51-d324-4c66-9cc5-3d781c1163e2-kube-api-access-jwggf" (OuterVolumeSpecName: "kube-api-access-jwggf") pod "7c41fa51-d324-4c66-9cc5-3d781c1163e2" (UID: "7c41fa51-d324-4c66-9cc5-3d781c1163e2"). InnerVolumeSpecName "kube-api-access-jwggf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.382626 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c41fa51-d324-4c66-9cc5-3d781c1163e2" (UID: "7c41fa51-d324-4c66-9cc5-3d781c1163e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.459486 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/7c41fa51-d324-4c66-9cc5-3d781c1163e2-kube-api-access-jwggf\") on node \"crc\" DevicePath \"\""
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.459578 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.459600 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c41fa51-d324-4c66-9cc5-3d781c1163e2-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.800074 4694 generic.go:334] "Generic (PLEG): container finished" podID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerID="5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8" exitCode=0
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.800479 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh8r5" event={"ID":"7c41fa51-d324-4c66-9cc5-3d781c1163e2","Type":"ContainerDied","Data":"5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8"}
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.800542 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh8r5" event={"ID":"7c41fa51-d324-4c66-9cc5-3d781c1163e2","Type":"ContainerDied","Data":"ba5fab3605c645eafdb94462d16712ff1576f1fed2d64991a259418e1913bbed"}
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.800567 4694 scope.go:117] "RemoveContainer" containerID="5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.800814 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh8r5"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.822405 4694 scope.go:117] "RemoveContainer" containerID="e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.842761 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh8r5"]
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.850932 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh8r5"]
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.861055 4694 scope.go:117] "RemoveContainer" containerID="47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.891332 4694 scope.go:117] "RemoveContainer" containerID="5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8"
Feb 17 17:26:29 crc kubenswrapper[4694]: E0217 17:26:29.891754 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8\": container with ID starting with 5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8 not found: ID does not exist" containerID="5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.891810 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8"} err="failed to get container status \"5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8\": rpc error: code = NotFound desc = could not find container \"5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8\": container with ID starting with 5fd0f28924664dc37a0d664db24214fb7b854c502081bd4bf5edebe2277ff0c8 not found: ID does not exist"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.891832 4694 scope.go:117] "RemoveContainer" containerID="e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499"
Feb 17 17:26:29 crc kubenswrapper[4694]: E0217 17:26:29.892129 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499\": container with ID starting with e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499 not found: ID does not exist" containerID="e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.892175 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499"} err="failed to get container status \"e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499\": rpc error: code = NotFound desc = could not find container \"e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499\": container with ID starting with e09840d0b89f9802625f7aebad9f7164676af971eb4f01d88a96465693719499 not found: ID does not exist"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.892204 4694 scope.go:117] "RemoveContainer" containerID="47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f"
Feb 17 17:26:29 crc kubenswrapper[4694]: E0217 17:26:29.892464 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f\": container with ID starting with 47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f not found: ID does not exist" containerID="47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f"
Feb 17 17:26:29 crc kubenswrapper[4694]: I0217 17:26:29.892490 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f"} err="failed to get container status \"47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f\": rpc error: code = NotFound desc = could not find container \"47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f\": container with ID starting with 47f0ccc734553006d4e68227d81597dc8d8372dc28a76bd9710ae64c80c8dc5f not found: ID does not exist"
Feb 17 17:26:30 crc kubenswrapper[4694]: I0217 17:26:30.906896 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" path="/var/lib/kubelet/pods/7c41fa51-d324-4c66-9cc5-3d781c1163e2/volumes"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.324058 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 17 17:27:11 crc kubenswrapper[4694]: E0217 17:27:11.325550 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="extract-utilities"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.325604 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="extract-utilities"
Feb 17 17:27:11 crc kubenswrapper[4694]: E0217 17:27:11.325701 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="extract-content"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.325720 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="extract-content"
Feb 17 17:27:11 crc kubenswrapper[4694]: E0217 17:27:11.325769 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="registry-server"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.325786 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="registry-server"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.326244 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c41fa51-d324-4c66-9cc5-3d781c1163e2" containerName="registry-server"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.327989 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.330428 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.330563 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.330691 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qgff7"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.330923 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.334501 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.418840 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zcx6\" (UniqueName: \"kubernetes.io/projected/5a4a02dc-9cc2-4445-9624-359734b69ae6-kube-api-access-5zcx6\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.418927 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419002 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419108 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419134 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419316 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419414 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419491 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.419554 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-config-data\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.522136 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-config-data\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.522646 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zcx6\" (UniqueName: \"kubernetes.io/projected/5a4a02dc-9cc2-4445-9624-359734b69ae6-kube-api-access-5zcx6\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.522851 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.523070 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.523373 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.523565 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.523806 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.523989 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.524180 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.523834 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.524283 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.524197 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.524533 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.525035 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-config-data\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.531481 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.531552 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.532354 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.545675 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zcx6\" (UniqueName: \"kubernetes.io/projected/5a4a02dc-9cc2-4445-9624-359734b69ae6-kube-api-access-5zcx6\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.562201 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " pod="openstack/tempest-tests-tempest"
Feb 17 17:27:11 crc kubenswrapper[4694]: I0217 17:27:11.648298 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 17 17:27:12 crc kubenswrapper[4694]: I0217 17:27:12.099469 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 17 17:27:12 crc kubenswrapper[4694]: W0217 17:27:12.107006 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a4a02dc_9cc2_4445_9624_359734b69ae6.slice/crio-7912e6512b66893cce9646ebf25ff1945a55086b07cd9f3d80577fd8a4fbb9e3 WatchSource:0}: Error finding container 7912e6512b66893cce9646ebf25ff1945a55086b07cd9f3d80577fd8a4fbb9e3: Status 404 returned error can't find the container with id 7912e6512b66893cce9646ebf25ff1945a55086b07cd9f3d80577fd8a4fbb9e3
Feb 17 17:27:12 crc kubenswrapper[4694]: I0217 17:27:12.110599 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 17:27:12 crc kubenswrapper[4694]: I0217 17:27:12.228307 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5a4a02dc-9cc2-4445-9624-359734b69ae6","Type":"ContainerStarted","Data":"7912e6512b66893cce9646ebf25ff1945a55086b07cd9f3d80577fd8a4fbb9e3"}
Feb 17 17:27:14 crc kubenswrapper[4694]: I0217 17:27:14.617542 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:27:14 crc kubenswrapper[4694]: I0217 17:27:14.617976 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.534954 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6dl68"]
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.538028 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.565849 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dl68"]
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.651548 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-catalog-content\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.651668 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgk5x\" (UniqueName: \"kubernetes.io/projected/617e7914-dfcb-4943-975b-7c7e62f08f59-kube-api-access-zgk5x\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.651696 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-utilities\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.753168 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-catalog-content\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.753295 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgk5x\" (UniqueName: \"kubernetes.io/projected/617e7914-dfcb-4943-975b-7c7e62f08f59-kube-api-access-zgk5x\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.753320 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-utilities\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.753786 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-catalog-content\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.754091 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-utilities\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.778177 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgk5x\" (UniqueName: \"kubernetes.io/projected/617e7914-dfcb-4943-975b-7c7e62f08f59-kube-api-access-zgk5x\") pod \"redhat-operators-6dl68\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:22 crc kubenswrapper[4694]: I0217 17:27:22.890637 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dl68"
Feb 17 17:27:23 crc kubenswrapper[4694]: I0217 17:27:23.393391 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dl68"]
Feb 17 17:27:43 crc kubenswrapper[4694]: E0217 17:27:43.109712 4694 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Feb 17 17:27:43 crc kubenswrapper[4694]: E0217 17:27:43.110229 4694 kuberuntime_manager.go:1274] "Unhandled Error" err="container
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zcx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5a4a02dc-9cc2-4445-9624-359734b69ae6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 17:27:43 crc kubenswrapper[4694]: E0217 17:27:43.111439 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5a4a02dc-9cc2-4445-9624-359734b69ae6" Feb 17 17:27:43 crc kubenswrapper[4694]: I0217 17:27:43.524959 4694 generic.go:334] "Generic (PLEG): container finished" podID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerID="4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea" exitCode=0 Feb 17 17:27:43 crc kubenswrapper[4694]: I0217 17:27:43.525017 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" 
event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerDied","Data":"4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea"} Feb 17 17:27:43 crc kubenswrapper[4694]: I0217 17:27:43.525048 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerStarted","Data":"7efe90aac2b8e020ac0bc69b17eca5d15fcab49eb754d53d08158b1a205e8ebc"} Feb 17 17:27:43 crc kubenswrapper[4694]: E0217 17:27:43.526413 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="5a4a02dc-9cc2-4445-9624-359734b69ae6" Feb 17 17:27:44 crc kubenswrapper[4694]: I0217 17:27:44.535589 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerStarted","Data":"5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5"} Feb 17 17:27:44 crc kubenswrapper[4694]: I0217 17:27:44.618177 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:27:44 crc kubenswrapper[4694]: I0217 17:27:44.618234 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:27:45 crc kubenswrapper[4694]: I0217 
17:27:45.549659 4694 generic.go:334] "Generic (PLEG): container finished" podID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerID="5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5" exitCode=0 Feb 17 17:27:45 crc kubenswrapper[4694]: I0217 17:27:45.549945 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerDied","Data":"5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5"} Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.318795 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nmwcv"] Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.321166 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.334666 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmwcv"] Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.341975 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-utilities\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.342070 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-catalog-content\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.342105 4694 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zr4t\" (UniqueName: \"kubernetes.io/projected/a770253b-4ce4-495a-814a-9632210f98ce-kube-api-access-6zr4t\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.443911 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-utilities\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.444012 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-catalog-content\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.444043 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zr4t\" (UniqueName: \"kubernetes.io/projected/a770253b-4ce4-495a-814a-9632210f98ce-kube-api-access-6zr4t\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.444589 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-catalog-content\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.444632 4694 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-utilities\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.469720 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zr4t\" (UniqueName: \"kubernetes.io/projected/a770253b-4ce4-495a-814a-9632210f98ce-kube-api-access-6zr4t\") pod \"community-operators-nmwcv\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.561204 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerStarted","Data":"5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f"} Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.582561 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6dl68" podStartSLOduration=22.035450096 podStartE2EDuration="24.582544251s" podCreationTimestamp="2026-02-17 17:27:22 +0000 UTC" firstStartedPulling="2026-02-17 17:27:43.526379735 +0000 UTC m=+2731.283455059" lastFinishedPulling="2026-02-17 17:27:46.07347389 +0000 UTC m=+2733.830549214" observedRunningTime="2026-02-17 17:27:46.577345422 +0000 UTC m=+2734.334420746" watchObservedRunningTime="2026-02-17 17:27:46.582544251 +0000 UTC m=+2734.339619575" Feb 17 17:27:46 crc kubenswrapper[4694]: I0217 17:27:46.644094 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:47 crc kubenswrapper[4694]: I0217 17:27:47.234694 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmwcv"] Feb 17 17:27:47 crc kubenswrapper[4694]: W0217 17:27:47.236407 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda770253b_4ce4_495a_814a_9632210f98ce.slice/crio-4e006be1095389132d31cf07dc0832bc56ad6551dff56153970b2254c299a652 WatchSource:0}: Error finding container 4e006be1095389132d31cf07dc0832bc56ad6551dff56153970b2254c299a652: Status 404 returned error can't find the container with id 4e006be1095389132d31cf07dc0832bc56ad6551dff56153970b2254c299a652 Feb 17 17:27:47 crc kubenswrapper[4694]: I0217 17:27:47.570093 4694 generic.go:334] "Generic (PLEG): container finished" podID="a770253b-4ce4-495a-814a-9632210f98ce" containerID="1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b" exitCode=0 Feb 17 17:27:47 crc kubenswrapper[4694]: I0217 17:27:47.570197 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmwcv" event={"ID":"a770253b-4ce4-495a-814a-9632210f98ce","Type":"ContainerDied","Data":"1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b"} Feb 17 17:27:47 crc kubenswrapper[4694]: I0217 17:27:47.570234 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmwcv" event={"ID":"a770253b-4ce4-495a-814a-9632210f98ce","Type":"ContainerStarted","Data":"4e006be1095389132d31cf07dc0832bc56ad6551dff56153970b2254c299a652"} Feb 17 17:27:48 crc kubenswrapper[4694]: I0217 17:27:48.579699 4694 generic.go:334] "Generic (PLEG): container finished" podID="a770253b-4ce4-495a-814a-9632210f98ce" containerID="6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea" exitCode=0 Feb 17 17:27:48 crc kubenswrapper[4694]: I0217 
17:27:48.579783 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmwcv" event={"ID":"a770253b-4ce4-495a-814a-9632210f98ce","Type":"ContainerDied","Data":"6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea"} Feb 17 17:27:49 crc kubenswrapper[4694]: I0217 17:27:49.590640 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmwcv" event={"ID":"a770253b-4ce4-495a-814a-9632210f98ce","Type":"ContainerStarted","Data":"1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f"} Feb 17 17:27:49 crc kubenswrapper[4694]: I0217 17:27:49.617510 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nmwcv" podStartSLOduration=2.173524972 podStartE2EDuration="3.617492601s" podCreationTimestamp="2026-02-17 17:27:46 +0000 UTC" firstStartedPulling="2026-02-17 17:27:47.571750435 +0000 UTC m=+2735.328825759" lastFinishedPulling="2026-02-17 17:27:49.015718064 +0000 UTC m=+2736.772793388" observedRunningTime="2026-02-17 17:27:49.612646511 +0000 UTC m=+2737.369721845" watchObservedRunningTime="2026-02-17 17:27:49.617492601 +0000 UTC m=+2737.374567915" Feb 17 17:27:52 crc kubenswrapper[4694]: I0217 17:27:52.891630 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6dl68" Feb 17 17:27:52 crc kubenswrapper[4694]: I0217 17:27:52.892152 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6dl68" Feb 17 17:27:53 crc kubenswrapper[4694]: I0217 17:27:53.935945 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6dl68" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="registry-server" probeResult="failure" output=< Feb 17 17:27:53 crc kubenswrapper[4694]: timeout: failed to connect service ":50051" within 1s Feb 17 17:27:53 crc 
kubenswrapper[4694]: > Feb 17 17:27:55 crc kubenswrapper[4694]: I0217 17:27:55.367191 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 17 17:27:56 crc kubenswrapper[4694]: I0217 17:27:56.644790 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:56 crc kubenswrapper[4694]: I0217 17:27:56.645307 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:56 crc kubenswrapper[4694]: I0217 17:27:56.648922 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5a4a02dc-9cc2-4445-9624-359734b69ae6","Type":"ContainerStarted","Data":"cda421b3aded1b01032177260cc97ca56500daf0a7290ecbc79e2373584cdbcd"} Feb 17 17:27:56 crc kubenswrapper[4694]: I0217 17:27:56.668305 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.413539101 podStartE2EDuration="46.668284801s" podCreationTimestamp="2026-02-17 17:27:10 +0000 UTC" firstStartedPulling="2026-02-17 17:27:12.11039304 +0000 UTC m=+2699.867468364" lastFinishedPulling="2026-02-17 17:27:55.36513874 +0000 UTC m=+2743.122214064" observedRunningTime="2026-02-17 17:27:56.667538983 +0000 UTC m=+2744.424614307" watchObservedRunningTime="2026-02-17 17:27:56.668284801 +0000 UTC m=+2744.425360125" Feb 17 17:27:56 crc kubenswrapper[4694]: I0217 17:27:56.705703 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:57 crc kubenswrapper[4694]: I0217 17:27:57.705431 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:27:57 crc kubenswrapper[4694]: I0217 17:27:57.767145 4694 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-nmwcv"] Feb 17 17:27:59 crc kubenswrapper[4694]: I0217 17:27:59.674218 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nmwcv" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="registry-server" containerID="cri-o://1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f" gracePeriod=2 Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.215217 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.332682 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zr4t\" (UniqueName: \"kubernetes.io/projected/a770253b-4ce4-495a-814a-9632210f98ce-kube-api-access-6zr4t\") pod \"a770253b-4ce4-495a-814a-9632210f98ce\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.332828 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-catalog-content\") pod \"a770253b-4ce4-495a-814a-9632210f98ce\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.332864 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-utilities\") pod \"a770253b-4ce4-495a-814a-9632210f98ce\" (UID: \"a770253b-4ce4-495a-814a-9632210f98ce\") " Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.333749 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-utilities" (OuterVolumeSpecName: "utilities") pod "a770253b-4ce4-495a-814a-9632210f98ce" 
(UID: "a770253b-4ce4-495a-814a-9632210f98ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.338551 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a770253b-4ce4-495a-814a-9632210f98ce-kube-api-access-6zr4t" (OuterVolumeSpecName: "kube-api-access-6zr4t") pod "a770253b-4ce4-495a-814a-9632210f98ce" (UID: "a770253b-4ce4-495a-814a-9632210f98ce"). InnerVolumeSpecName "kube-api-access-6zr4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.386088 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a770253b-4ce4-495a-814a-9632210f98ce" (UID: "a770253b-4ce4-495a-814a-9632210f98ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.435899 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.435931 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a770253b-4ce4-495a-814a-9632210f98ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.435946 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zr4t\" (UniqueName: \"kubernetes.io/projected/a770253b-4ce4-495a-814a-9632210f98ce-kube-api-access-6zr4t\") on node \"crc\" DevicePath \"\"" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.683255 4694 generic.go:334] "Generic (PLEG): container finished" 
podID="a770253b-4ce4-495a-814a-9632210f98ce" containerID="1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f" exitCode=0 Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.683341 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nmwcv" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.683355 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmwcv" event={"ID":"a770253b-4ce4-495a-814a-9632210f98ce","Type":"ContainerDied","Data":"1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f"} Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.683790 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmwcv" event={"ID":"a770253b-4ce4-495a-814a-9632210f98ce","Type":"ContainerDied","Data":"4e006be1095389132d31cf07dc0832bc56ad6551dff56153970b2254c299a652"} Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.683810 4694 scope.go:117] "RemoveContainer" containerID="1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.725734 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nmwcv"] Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.728794 4694 scope.go:117] "RemoveContainer" containerID="6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.739753 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nmwcv"] Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.765592 4694 scope.go:117] "RemoveContainer" containerID="1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.815235 4694 scope.go:117] "RemoveContainer" 
containerID="1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f" Feb 17 17:28:00 crc kubenswrapper[4694]: E0217 17:28:00.815702 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f\": container with ID starting with 1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f not found: ID does not exist" containerID="1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.815734 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f"} err="failed to get container status \"1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f\": rpc error: code = NotFound desc = could not find container \"1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f\": container with ID starting with 1677474e42bf4f84c1b9f07a01abd49f491c9c63e06a3697a66d987c47fcad5f not found: ID does not exist" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.815757 4694 scope.go:117] "RemoveContainer" containerID="6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea" Feb 17 17:28:00 crc kubenswrapper[4694]: E0217 17:28:00.817477 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea\": container with ID starting with 6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea not found: ID does not exist" containerID="6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.817671 4694 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea"} err="failed to get container status \"6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea\": rpc error: code = NotFound desc = could not find container \"6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea\": container with ID starting with 6dad163ac44fec8dea34adb67d7fa19ab359f62ed02e7a62f97cb6d1970ddbea not found: ID does not exist" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.817710 4694 scope.go:117] "RemoveContainer" containerID="1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b" Feb 17 17:28:00 crc kubenswrapper[4694]: E0217 17:28:00.818019 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b\": container with ID starting with 1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b not found: ID does not exist" containerID="1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.818059 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b"} err="failed to get container status \"1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b\": rpc error: code = NotFound desc = could not find container \"1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b\": container with ID starting with 1fe00a3a216a4a3a8a784e89faa7e7395466a164226c7008cdb712f400030a0b not found: ID does not exist" Feb 17 17:28:00 crc kubenswrapper[4694]: I0217 17:28:00.907012 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a770253b-4ce4-495a-814a-9632210f98ce" path="/var/lib/kubelet/pods/a770253b-4ce4-495a-814a-9632210f98ce/volumes" Feb 17 17:28:02 crc kubenswrapper[4694]: I0217 
17:28:02.954326 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6dl68" Feb 17 17:28:03 crc kubenswrapper[4694]: I0217 17:28:03.021183 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6dl68" Feb 17 17:28:03 crc kubenswrapper[4694]: I0217 17:28:03.455451 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dl68"] Feb 17 17:28:04 crc kubenswrapper[4694]: I0217 17:28:04.726206 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6dl68" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="registry-server" containerID="cri-o://5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f" gracePeriod=2 Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.183319 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dl68" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.323298 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgk5x\" (UniqueName: \"kubernetes.io/projected/617e7914-dfcb-4943-975b-7c7e62f08f59-kube-api-access-zgk5x\") pod \"617e7914-dfcb-4943-975b-7c7e62f08f59\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.323344 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-catalog-content\") pod \"617e7914-dfcb-4943-975b-7c7e62f08f59\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.323386 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-utilities\") pod \"617e7914-dfcb-4943-975b-7c7e62f08f59\" (UID: \"617e7914-dfcb-4943-975b-7c7e62f08f59\") " Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.324013 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-utilities" (OuterVolumeSpecName: "utilities") pod "617e7914-dfcb-4943-975b-7c7e62f08f59" (UID: "617e7914-dfcb-4943-975b-7c7e62f08f59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.328696 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617e7914-dfcb-4943-975b-7c7e62f08f59-kube-api-access-zgk5x" (OuterVolumeSpecName: "kube-api-access-zgk5x") pod "617e7914-dfcb-4943-975b-7c7e62f08f59" (UID: "617e7914-dfcb-4943-975b-7c7e62f08f59"). InnerVolumeSpecName "kube-api-access-zgk5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.426074 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgk5x\" (UniqueName: \"kubernetes.io/projected/617e7914-dfcb-4943-975b-7c7e62f08f59-kube-api-access-zgk5x\") on node \"crc\" DevicePath \"\"" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.426114 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.443700 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "617e7914-dfcb-4943-975b-7c7e62f08f59" (UID: "617e7914-dfcb-4943-975b-7c7e62f08f59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.527860 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/617e7914-dfcb-4943-975b-7c7e62f08f59-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.741916 4694 generic.go:334] "Generic (PLEG): container finished" podID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerID="5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f" exitCode=0 Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.742018 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dl68" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.742063 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerDied","Data":"5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f"} Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.742675 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dl68" event={"ID":"617e7914-dfcb-4943-975b-7c7e62f08f59","Type":"ContainerDied","Data":"7efe90aac2b8e020ac0bc69b17eca5d15fcab49eb754d53d08158b1a205e8ebc"} Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.742709 4694 scope.go:117] "RemoveContainer" containerID="5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.783784 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dl68"] Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.784071 4694 scope.go:117] "RemoveContainer" containerID="5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 
17:28:05.791588 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6dl68"] Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.828341 4694 scope.go:117] "RemoveContainer" containerID="4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.871306 4694 scope.go:117] "RemoveContainer" containerID="5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f" Feb 17 17:28:05 crc kubenswrapper[4694]: E0217 17:28:05.871899 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f\": container with ID starting with 5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f not found: ID does not exist" containerID="5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.871944 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f"} err="failed to get container status \"5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f\": rpc error: code = NotFound desc = could not find container \"5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f\": container with ID starting with 5f207ee84e3c328c5ea5310dd985435a11dfe6311e7dfacf63f35abe0b408f7f not found: ID does not exist" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.871975 4694 scope.go:117] "RemoveContainer" containerID="5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5" Feb 17 17:28:05 crc kubenswrapper[4694]: E0217 17:28:05.873487 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5\": container with ID 
starting with 5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5 not found: ID does not exist" containerID="5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.873515 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5"} err="failed to get container status \"5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5\": rpc error: code = NotFound desc = could not find container \"5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5\": container with ID starting with 5feaacb15630bf7b585e081551fcf95902b8e49798064afc034cf4ea782a36c5 not found: ID does not exist" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.873529 4694 scope.go:117] "RemoveContainer" containerID="4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea" Feb 17 17:28:05 crc kubenswrapper[4694]: E0217 17:28:05.874244 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea\": container with ID starting with 4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea not found: ID does not exist" containerID="4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea" Feb 17 17:28:05 crc kubenswrapper[4694]: I0217 17:28:05.874264 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea"} err="failed to get container status \"4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea\": rpc error: code = NotFound desc = could not find container \"4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea\": container with ID starting with 4cbddebf35c9457384ca4dd76c460471822d8a97affb8306b5848b53904d1cea not found: 
ID does not exist" Feb 17 17:28:06 crc kubenswrapper[4694]: I0217 17:28:06.906093 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" path="/var/lib/kubelet/pods/617e7914-dfcb-4943-975b-7c7e62f08f59/volumes" Feb 17 17:28:14 crc kubenswrapper[4694]: I0217 17:28:14.617826 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:28:14 crc kubenswrapper[4694]: I0217 17:28:14.618816 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:28:14 crc kubenswrapper[4694]: I0217 17:28:14.618900 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:28:14 crc kubenswrapper[4694]: I0217 17:28:14.834503 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e4c65ae75daf038be895421e845d23b212b799247ca1ff722bd87391856051d"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:28:14 crc kubenswrapper[4694]: I0217 17:28:14.834573 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" 
containerID="cri-o://7e4c65ae75daf038be895421e845d23b212b799247ca1ff722bd87391856051d" gracePeriod=600 Feb 17 17:28:15 crc kubenswrapper[4694]: I0217 17:28:15.847103 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="7e4c65ae75daf038be895421e845d23b212b799247ca1ff722bd87391856051d" exitCode=0 Feb 17 17:28:15 crc kubenswrapper[4694]: I0217 17:28:15.847183 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"7e4c65ae75daf038be895421e845d23b212b799247ca1ff722bd87391856051d"} Feb 17 17:28:15 crc kubenswrapper[4694]: I0217 17:28:15.847878 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62"} Feb 17 17:28:15 crc kubenswrapper[4694]: I0217 17:28:15.847905 4694 scope.go:117] "RemoveContainer" containerID="87a5125a17c5dfe383044c4794a0856d92e69b40441f8e661bffeb11baced37f" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.051840 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nfjd8"] Feb 17 17:29:05 crc kubenswrapper[4694]: E0217 17:29:05.052791 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="extract-content" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.052806 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="extract-content" Feb 17 17:29:05 crc kubenswrapper[4694]: E0217 17:29:05.052816 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="extract-utilities" Feb 17 
17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.052822 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="extract-utilities" Feb 17 17:29:05 crc kubenswrapper[4694]: E0217 17:29:05.052837 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="registry-server" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.052843 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="registry-server" Feb 17 17:29:05 crc kubenswrapper[4694]: E0217 17:29:05.052852 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="extract-content" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.052858 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="extract-content" Feb 17 17:29:05 crc kubenswrapper[4694]: E0217 17:29:05.052870 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="extract-utilities" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.052876 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="extract-utilities" Feb 17 17:29:05 crc kubenswrapper[4694]: E0217 17:29:05.052888 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="registry-server" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.052894 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="registry-server" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.053059 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="617e7914-dfcb-4943-975b-7c7e62f08f59" containerName="registry-server" Feb 17 17:29:05 
crc kubenswrapper[4694]: I0217 17:29:05.053080 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a770253b-4ce4-495a-814a-9632210f98ce" containerName="registry-server" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.054380 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.065395 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfjd8"] Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.166141 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-catalog-content\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.166479 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-utilities\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.166698 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x2b6\" (UniqueName: \"kubernetes.io/projected/5fd8e2ce-524a-4285-8cce-78b7df0af95f-kube-api-access-5x2b6\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.268038 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x2b6\" (UniqueName: 
\"kubernetes.io/projected/5fd8e2ce-524a-4285-8cce-78b7df0af95f-kube-api-access-5x2b6\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.268131 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-catalog-content\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.268215 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-utilities\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.268696 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-utilities\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.268964 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-catalog-content\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.289388 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x2b6\" (UniqueName: 
\"kubernetes.io/projected/5fd8e2ce-524a-4285-8cce-78b7df0af95f-kube-api-access-5x2b6\") pod \"certified-operators-nfjd8\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.409745 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:05 crc kubenswrapper[4694]: I0217 17:29:05.999399 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfjd8"] Feb 17 17:29:06 crc kubenswrapper[4694]: I0217 17:29:06.326907 4694 generic.go:334] "Generic (PLEG): container finished" podID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerID="23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93" exitCode=0 Feb 17 17:29:06 crc kubenswrapper[4694]: I0217 17:29:06.326961 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfjd8" event={"ID":"5fd8e2ce-524a-4285-8cce-78b7df0af95f","Type":"ContainerDied","Data":"23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93"} Feb 17 17:29:06 crc kubenswrapper[4694]: I0217 17:29:06.327229 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfjd8" event={"ID":"5fd8e2ce-524a-4285-8cce-78b7df0af95f","Type":"ContainerStarted","Data":"663aee7662e7db464c75629701d9004733584bb7c70bbfeb9bf67533408fb4d5"} Feb 17 17:29:07 crc kubenswrapper[4694]: I0217 17:29:07.336480 4694 generic.go:334] "Generic (PLEG): container finished" podID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerID="5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9" exitCode=0 Feb 17 17:29:07 crc kubenswrapper[4694]: I0217 17:29:07.336804 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfjd8" 
event={"ID":"5fd8e2ce-524a-4285-8cce-78b7df0af95f","Type":"ContainerDied","Data":"5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9"} Feb 17 17:29:07 crc kubenswrapper[4694]: E0217 17:29:07.471179 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd8e2ce_524a_4285_8cce_78b7df0af95f.slice/crio-5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd8e2ce_524a_4285_8cce_78b7df0af95f.slice/crio-conmon-5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:29:08 crc kubenswrapper[4694]: I0217 17:29:08.347114 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfjd8" event={"ID":"5fd8e2ce-524a-4285-8cce-78b7df0af95f","Type":"ContainerStarted","Data":"0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113"} Feb 17 17:29:08 crc kubenswrapper[4694]: I0217 17:29:08.369182 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nfjd8" podStartSLOduration=1.975387883 podStartE2EDuration="3.369157548s" podCreationTimestamp="2026-02-17 17:29:05 +0000 UTC" firstStartedPulling="2026-02-17 17:29:06.329574115 +0000 UTC m=+2814.086649469" lastFinishedPulling="2026-02-17 17:29:07.72334381 +0000 UTC m=+2815.480419134" observedRunningTime="2026-02-17 17:29:08.363106928 +0000 UTC m=+2816.120182252" watchObservedRunningTime="2026-02-17 17:29:08.369157548 +0000 UTC m=+2816.126232872" Feb 17 17:29:15 crc kubenswrapper[4694]: I0217 17:29:15.409874 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:15 crc kubenswrapper[4694]: I0217 
17:29:15.410683 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:15 crc kubenswrapper[4694]: I0217 17:29:15.479877 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:16 crc kubenswrapper[4694]: I0217 17:29:16.474761 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:16 crc kubenswrapper[4694]: I0217 17:29:16.536381 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfjd8"] Feb 17 17:29:18 crc kubenswrapper[4694]: I0217 17:29:18.433784 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nfjd8" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="registry-server" containerID="cri-o://0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113" gracePeriod=2 Feb 17 17:29:18 crc kubenswrapper[4694]: I0217 17:29:18.991200 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.076811 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x2b6\" (UniqueName: \"kubernetes.io/projected/5fd8e2ce-524a-4285-8cce-78b7df0af95f-kube-api-access-5x2b6\") pod \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.076865 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-utilities\") pod \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.076993 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-catalog-content\") pod \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\" (UID: \"5fd8e2ce-524a-4285-8cce-78b7df0af95f\") " Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.078751 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-utilities" (OuterVolumeSpecName: "utilities") pod "5fd8e2ce-524a-4285-8cce-78b7df0af95f" (UID: "5fd8e2ce-524a-4285-8cce-78b7df0af95f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.083032 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd8e2ce-524a-4285-8cce-78b7df0af95f-kube-api-access-5x2b6" (OuterVolumeSpecName: "kube-api-access-5x2b6") pod "5fd8e2ce-524a-4285-8cce-78b7df0af95f" (UID: "5fd8e2ce-524a-4285-8cce-78b7df0af95f"). InnerVolumeSpecName "kube-api-access-5x2b6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.125155 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fd8e2ce-524a-4285-8cce-78b7df0af95f" (UID: "5fd8e2ce-524a-4285-8cce-78b7df0af95f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.179514 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x2b6\" (UniqueName: \"kubernetes.io/projected/5fd8e2ce-524a-4285-8cce-78b7df0af95f-kube-api-access-5x2b6\") on node \"crc\" DevicePath \"\"" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.179553 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.179562 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd8e2ce-524a-4285-8cce-78b7df0af95f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.444095 4694 generic.go:334] "Generic (PLEG): container finished" podID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerID="0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113" exitCode=0 Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.444139 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfjd8" event={"ID":"5fd8e2ce-524a-4285-8cce-78b7df0af95f","Type":"ContainerDied","Data":"0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113"} Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.444439 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-nfjd8" event={"ID":"5fd8e2ce-524a-4285-8cce-78b7df0af95f","Type":"ContainerDied","Data":"663aee7662e7db464c75629701d9004733584bb7c70bbfeb9bf67533408fb4d5"} Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.444181 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfjd8" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.444454 4694 scope.go:117] "RemoveContainer" containerID="0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.464409 4694 scope.go:117] "RemoveContainer" containerID="5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.482755 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfjd8"] Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.495588 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nfjd8"] Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.516879 4694 scope.go:117] "RemoveContainer" containerID="23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.542140 4694 scope.go:117] "RemoveContainer" containerID="0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113" Feb 17 17:29:19 crc kubenswrapper[4694]: E0217 17:29:19.542557 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113\": container with ID starting with 0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113 not found: ID does not exist" containerID="0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 
17:29:19.542597 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113"} err="failed to get container status \"0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113\": rpc error: code = NotFound desc = could not find container \"0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113\": container with ID starting with 0aafdff7b3866d6750139986da7f69e0aec0ce3fca27811a75d7329362576113 not found: ID does not exist" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.542664 4694 scope.go:117] "RemoveContainer" containerID="5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9" Feb 17 17:29:19 crc kubenswrapper[4694]: E0217 17:29:19.543184 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9\": container with ID starting with 5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9 not found: ID does not exist" containerID="5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.543222 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9"} err="failed to get container status \"5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9\": rpc error: code = NotFound desc = could not find container \"5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9\": container with ID starting with 5bf1831351d3d0884f696c35277bd37f90ece505e8b67c36d0d0434a41cfa1f9 not found: ID does not exist" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.543249 4694 scope.go:117] "RemoveContainer" containerID="23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93" Feb 17 17:29:19 crc 
kubenswrapper[4694]: E0217 17:29:19.543505 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93\": container with ID starting with 23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93 not found: ID does not exist" containerID="23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93" Feb 17 17:29:19 crc kubenswrapper[4694]: I0217 17:29:19.543532 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93"} err="failed to get container status \"23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93\": rpc error: code = NotFound desc = could not find container \"23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93\": container with ID starting with 23da47caf605d51c0ace02906055abbc58ee20a5a71d1d1e70b6a23e74e2fe93 not found: ID does not exist" Feb 17 17:29:20 crc kubenswrapper[4694]: I0217 17:29:20.904661 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" path="/var/lib/kubelet/pods/5fd8e2ce-524a-4285-8cce-78b7df0af95f/volumes" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.148948 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw"] Feb 17 17:30:00 crc kubenswrapper[4694]: E0217 17:30:00.149996 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="extract-content" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.150013 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="extract-content" Feb 17 17:30:00 crc kubenswrapper[4694]: E0217 17:30:00.150026 4694 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="extract-utilities" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.150036 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="extract-utilities" Feb 17 17:30:00 crc kubenswrapper[4694]: E0217 17:30:00.150067 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="registry-server" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.150074 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="registry-server" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.150250 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd8e2ce-524a-4285-8cce-78b7df0af95f" containerName="registry-server" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.150967 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.155780 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.156056 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.163355 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw"] Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.286583 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-config-volume\") pod 
\"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.286697 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-secret-volume\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.286763 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jt2\" (UniqueName: \"kubernetes.io/projected/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-kube-api-access-96jt2\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.388129 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-secret-volume\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.388188 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96jt2\" (UniqueName: \"kubernetes.io/projected/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-kube-api-access-96jt2\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.388325 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-config-volume\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.389122 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-config-volume\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.394034 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-secret-volume\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.406155 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jt2\" (UniqueName: \"kubernetes.io/projected/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-kube-api-access-96jt2\") pod \"collect-profiles-29522490-mnkzw\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.485105 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:00 crc kubenswrapper[4694]: I0217 17:30:00.941171 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw"] Feb 17 17:30:01 crc kubenswrapper[4694]: I0217 17:30:01.870966 4694 generic.go:334] "Generic (PLEG): container finished" podID="e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" containerID="0b66761e599ddc7443c211f8e92467fc8c3955e7e444ef6015ba76241b3a78eb" exitCode=0 Feb 17 17:30:01 crc kubenswrapper[4694]: I0217 17:30:01.871109 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" event={"ID":"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f","Type":"ContainerDied","Data":"0b66761e599ddc7443c211f8e92467fc8c3955e7e444ef6015ba76241b3a78eb"} Feb 17 17:30:01 crc kubenswrapper[4694]: I0217 17:30:01.871307 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" event={"ID":"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f","Type":"ContainerStarted","Data":"312f6e6f5317f7b1eafa1a56bd0575a6a2a9d0920406cadb45d248b0b45f6bb7"} Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.261501 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.349033 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-secret-volume\") pod \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.349200 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-config-volume\") pod \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.349225 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96jt2\" (UniqueName: \"kubernetes.io/projected/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-kube-api-access-96jt2\") pod \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\" (UID: \"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f\") " Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.350580 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-config-volume" (OuterVolumeSpecName: "config-volume") pod "e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" (UID: "e5cf9b8e-99de-4cdb-83e9-b38f7438e67f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.354359 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" (UID: "e5cf9b8e-99de-4cdb-83e9-b38f7438e67f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.354589 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-kube-api-access-96jt2" (OuterVolumeSpecName: "kube-api-access-96jt2") pod "e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" (UID: "e5cf9b8e-99de-4cdb-83e9-b38f7438e67f"). InnerVolumeSpecName "kube-api-access-96jt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.451883 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.451920 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96jt2\" (UniqueName: \"kubernetes.io/projected/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-kube-api-access-96jt2\") on node \"crc\" DevicePath \"\"" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.451932 4694 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e5cf9b8e-99de-4cdb-83e9-b38f7438e67f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.888971 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" event={"ID":"e5cf9b8e-99de-4cdb-83e9-b38f7438e67f","Type":"ContainerDied","Data":"312f6e6f5317f7b1eafa1a56bd0575a6a2a9d0920406cadb45d248b0b45f6bb7"} Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.889011 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="312f6e6f5317f7b1eafa1a56bd0575a6a2a9d0920406cadb45d248b0b45f6bb7" Feb 17 17:30:03 crc kubenswrapper[4694]: I0217 17:30:03.889051 4694 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522490-mnkzw" Feb 17 17:30:04 crc kubenswrapper[4694]: I0217 17:30:04.342023 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"] Feb 17 17:30:04 crc kubenswrapper[4694]: I0217 17:30:04.352121 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522445-plgcb"] Feb 17 17:30:04 crc kubenswrapper[4694]: I0217 17:30:04.906903 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b786d2-03c3-4db5-b5b2-f8f79e30efd4" path="/var/lib/kubelet/pods/50b786d2-03c3-4db5-b5b2-f8f79e30efd4/volumes" Feb 17 17:30:43 crc kubenswrapper[4694]: I0217 17:30:43.149100 4694 scope.go:117] "RemoveContainer" containerID="f934634bcbf6bba16f0d59ee82f75e360f2f7669864dfdac04fdb2e3f0ff9546" Feb 17 17:30:44 crc kubenswrapper[4694]: I0217 17:30:44.618135 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:30:44 crc kubenswrapper[4694]: I0217 17:30:44.618459 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:31:14 crc kubenswrapper[4694]: I0217 17:31:14.618295 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 17 17:31:14 crc kubenswrapper[4694]: I0217 17:31:14.618825 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.618184 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.618804 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.618859 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.619763 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.619834 4694 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" gracePeriod=600 Feb 17 17:31:44 crc kubenswrapper[4694]: E0217 17:31:44.770742 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.841332 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" exitCode=0 Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.841381 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62"} Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.841434 4694 scope.go:117] "RemoveContainer" containerID="7e4c65ae75daf038be895421e845d23b212b799247ca1ff722bd87391856051d" Feb 17 17:31:44 crc kubenswrapper[4694]: I0217 17:31:44.842144 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:31:44 crc kubenswrapper[4694]: E0217 17:31:44.842432 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:31:58 crc kubenswrapper[4694]: I0217 17:31:58.896048 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:31:58 crc kubenswrapper[4694]: E0217 17:31:58.896753 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:32:11 crc kubenswrapper[4694]: I0217 17:32:11.896160 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:32:11 crc kubenswrapper[4694]: E0217 17:32:11.897137 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:32:22 crc kubenswrapper[4694]: I0217 17:32:22.903746 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:32:22 crc kubenswrapper[4694]: E0217 17:32:22.904552 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:32:33 crc kubenswrapper[4694]: I0217 17:32:33.896664 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:32:33 crc kubenswrapper[4694]: E0217 17:32:33.897912 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:32:46 crc kubenswrapper[4694]: I0217 17:32:46.896570 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:32:46 crc kubenswrapper[4694]: E0217 17:32:46.897780 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:32:58 crc kubenswrapper[4694]: I0217 17:32:58.895254 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:32:58 crc kubenswrapper[4694]: E0217 17:32:58.896048 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:33:09 crc kubenswrapper[4694]: I0217 17:33:09.895687 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:33:09 crc kubenswrapper[4694]: E0217 17:33:09.896552 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:33:20 crc kubenswrapper[4694]: I0217 17:33:20.895493 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:33:20 crc kubenswrapper[4694]: E0217 17:33:20.896271 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:33:35 crc kubenswrapper[4694]: I0217 17:33:35.896110 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:33:35 crc kubenswrapper[4694]: E0217 17:33:35.897446 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:33:46 crc kubenswrapper[4694]: I0217 17:33:46.895791 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:33:46 crc kubenswrapper[4694]: E0217 17:33:46.896593 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:34:01 crc kubenswrapper[4694]: I0217 17:34:01.896134 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:34:01 crc kubenswrapper[4694]: E0217 17:34:01.898491 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:34:13 crc kubenswrapper[4694]: I0217 17:34:13.895970 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:34:13 crc kubenswrapper[4694]: E0217 17:34:13.897336 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:34:24 crc kubenswrapper[4694]: I0217 17:34:24.895431 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:34:24 crc kubenswrapper[4694]: E0217 17:34:24.896226 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:34:39 crc kubenswrapper[4694]: I0217 17:34:39.895123 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:34:39 crc kubenswrapper[4694]: E0217 17:34:39.895923 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:34:50 crc kubenswrapper[4694]: I0217 17:34:50.895374 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:34:50 crc kubenswrapper[4694]: E0217 17:34:50.896410 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:35:02 crc kubenswrapper[4694]: I0217 17:35:02.896112 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:35:02 crc kubenswrapper[4694]: E0217 17:35:02.897246 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:35:14 crc kubenswrapper[4694]: I0217 17:35:14.895797 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:35:14 crc kubenswrapper[4694]: E0217 17:35:14.896795 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:35:29 crc kubenswrapper[4694]: I0217 17:35:29.895732 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:35:29 crc kubenswrapper[4694]: E0217 17:35:29.897741 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:35:41 crc kubenswrapper[4694]: I0217 17:35:41.895294 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:35:41 crc kubenswrapper[4694]: E0217 17:35:41.896153 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:35:54 crc kubenswrapper[4694]: I0217 17:35:54.896151 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:35:54 crc kubenswrapper[4694]: E0217 17:35:54.897026 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:36:06 crc kubenswrapper[4694]: I0217 17:36:06.895817 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:36:06 crc kubenswrapper[4694]: E0217 17:36:06.896529 4694 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:36:20 crc kubenswrapper[4694]: I0217 17:36:20.896214 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:36:20 crc kubenswrapper[4694]: E0217 17:36:20.897029 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.332769 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nwkn5"] Feb 17 17:36:29 crc kubenswrapper[4694]: E0217 17:36:29.334201 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" containerName="collect-profiles" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.334241 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" containerName="collect-profiles" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.334598 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5cf9b8e-99de-4cdb-83e9-b38f7438e67f" containerName="collect-profiles" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.336679 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.343425 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwkn5"] Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.437229 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-utilities\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.437562 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f426m\" (UniqueName: \"kubernetes.io/projected/a012779d-0000-442f-a6be-48e9c00ce0fd-kube-api-access-f426m\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.437713 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-catalog-content\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.539675 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-utilities\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.539755 4694 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f426m\" (UniqueName: \"kubernetes.io/projected/a012779d-0000-442f-a6be-48e9c00ce0fd-kube-api-access-f426m\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.539802 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-catalog-content\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.540267 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-catalog-content\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.540930 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-utilities\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.566496 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f426m\" (UniqueName: \"kubernetes.io/projected/a012779d-0000-442f-a6be-48e9c00ce0fd-kube-api-access-f426m\") pod \"redhat-marketplace-nwkn5\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:29 crc kubenswrapper[4694]: I0217 17:36:29.675309 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:30 crc kubenswrapper[4694]: I0217 17:36:30.202744 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwkn5"] Feb 17 17:36:30 crc kubenswrapper[4694]: I0217 17:36:30.565144 4694 generic.go:334] "Generic (PLEG): container finished" podID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerID="cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1" exitCode=0 Feb 17 17:36:30 crc kubenswrapper[4694]: I0217 17:36:30.565188 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwkn5" event={"ID":"a012779d-0000-442f-a6be-48e9c00ce0fd","Type":"ContainerDied","Data":"cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1"} Feb 17 17:36:30 crc kubenswrapper[4694]: I0217 17:36:30.565214 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwkn5" event={"ID":"a012779d-0000-442f-a6be-48e9c00ce0fd","Type":"ContainerStarted","Data":"ba8281f562baf4710befc7eef87a9409cc43d7d27a4b29c97b66ad53ff40a4c7"} Feb 17 17:36:30 crc kubenswrapper[4694]: I0217 17:36:30.566944 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:36:31 crc kubenswrapper[4694]: I0217 17:36:31.586122 4694 generic.go:334] "Generic (PLEG): container finished" podID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerID="95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179" exitCode=0 Feb 17 17:36:31 crc kubenswrapper[4694]: I0217 17:36:31.586333 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwkn5" event={"ID":"a012779d-0000-442f-a6be-48e9c00ce0fd","Type":"ContainerDied","Data":"95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179"} Feb 17 17:36:32 crc kubenswrapper[4694]: I0217 17:36:32.606342 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-nwkn5" event={"ID":"a012779d-0000-442f-a6be-48e9c00ce0fd","Type":"ContainerStarted","Data":"91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89"} Feb 17 17:36:32 crc kubenswrapper[4694]: I0217 17:36:32.635799 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nwkn5" podStartSLOduration=2.008076096 podStartE2EDuration="3.635779763s" podCreationTimestamp="2026-02-17 17:36:29 +0000 UTC" firstStartedPulling="2026-02-17 17:36:30.566579976 +0000 UTC m=+3258.323655300" lastFinishedPulling="2026-02-17 17:36:32.194283643 +0000 UTC m=+3259.951358967" observedRunningTime="2026-02-17 17:36:32.630221325 +0000 UTC m=+3260.387296659" watchObservedRunningTime="2026-02-17 17:36:32.635779763 +0000 UTC m=+3260.392855087" Feb 17 17:36:35 crc kubenswrapper[4694]: I0217 17:36:35.896390 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:36:35 crc kubenswrapper[4694]: E0217 17:36:35.897304 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:36:39 crc kubenswrapper[4694]: I0217 17:36:39.676134 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:39 crc kubenswrapper[4694]: I0217 17:36:39.676788 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:39 crc kubenswrapper[4694]: I0217 17:36:39.721501 4694 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:40 crc kubenswrapper[4694]: I0217 17:36:40.722822 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:40 crc kubenswrapper[4694]: I0217 17:36:40.764460 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwkn5"] Feb 17 17:36:42 crc kubenswrapper[4694]: I0217 17:36:42.689927 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nwkn5" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="registry-server" containerID="cri-o://91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89" gracePeriod=2 Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.181677 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.311418 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f426m\" (UniqueName: \"kubernetes.io/projected/a012779d-0000-442f-a6be-48e9c00ce0fd-kube-api-access-f426m\") pod \"a012779d-0000-442f-a6be-48e9c00ce0fd\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.311683 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-catalog-content\") pod \"a012779d-0000-442f-a6be-48e9c00ce0fd\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.311781 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-utilities\") 
pod \"a012779d-0000-442f-a6be-48e9c00ce0fd\" (UID: \"a012779d-0000-442f-a6be-48e9c00ce0fd\") " Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.312795 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-utilities" (OuterVolumeSpecName: "utilities") pod "a012779d-0000-442f-a6be-48e9c00ce0fd" (UID: "a012779d-0000-442f-a6be-48e9c00ce0fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.318996 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a012779d-0000-442f-a6be-48e9c00ce0fd-kube-api-access-f426m" (OuterVolumeSpecName: "kube-api-access-f426m") pod "a012779d-0000-442f-a6be-48e9c00ce0fd" (UID: "a012779d-0000-442f-a6be-48e9c00ce0fd"). InnerVolumeSpecName "kube-api-access-f426m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.337258 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a012779d-0000-442f-a6be-48e9c00ce0fd" (UID: "a012779d-0000-442f-a6be-48e9c00ce0fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.414943 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.414977 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f426m\" (UniqueName: \"kubernetes.io/projected/a012779d-0000-442f-a6be-48e9c00ce0fd-kube-api-access-f426m\") on node \"crc\" DevicePath \"\"" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.414994 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a012779d-0000-442f-a6be-48e9c00ce0fd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.698573 4694 generic.go:334] "Generic (PLEG): container finished" podID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerID="91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89" exitCode=0 Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.698626 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwkn5" event={"ID":"a012779d-0000-442f-a6be-48e9c00ce0fd","Type":"ContainerDied","Data":"91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89"} Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.698878 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwkn5" event={"ID":"a012779d-0000-442f-a6be-48e9c00ce0fd","Type":"ContainerDied","Data":"ba8281f562baf4710befc7eef87a9409cc43d7d27a4b29c97b66ad53ff40a4c7"} Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.698677 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwkn5" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.698901 4694 scope.go:117] "RemoveContainer" containerID="91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.719875 4694 scope.go:117] "RemoveContainer" containerID="95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.731086 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwkn5"] Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.741168 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwkn5"] Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.763280 4694 scope.go:117] "RemoveContainer" containerID="cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.808942 4694 scope.go:117] "RemoveContainer" containerID="91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89" Feb 17 17:36:43 crc kubenswrapper[4694]: E0217 17:36:43.809482 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89\": container with ID starting with 91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89 not found: ID does not exist" containerID="91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.809557 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89"} err="failed to get container status \"91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89\": rpc error: code = NotFound desc = could not find container 
\"91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89\": container with ID starting with 91d380057492daec87414afc0e9f8443744e35ff3843d3f99f09b5cc69cd5c89 not found: ID does not exist" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.809589 4694 scope.go:117] "RemoveContainer" containerID="95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179" Feb 17 17:36:43 crc kubenswrapper[4694]: E0217 17:36:43.810110 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179\": container with ID starting with 95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179 not found: ID does not exist" containerID="95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.810156 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179"} err="failed to get container status \"95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179\": rpc error: code = NotFound desc = could not find container \"95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179\": container with ID starting with 95a53e02ee7e52670402fa27422d9e127f62d4b9b76ce5f5ff760ddfbcfa5179 not found: ID does not exist" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.810185 4694 scope.go:117] "RemoveContainer" containerID="cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1" Feb 17 17:36:43 crc kubenswrapper[4694]: E0217 17:36:43.810576 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1\": container with ID starting with cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1 not found: ID does not exist" 
containerID="cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1" Feb 17 17:36:43 crc kubenswrapper[4694]: I0217 17:36:43.810600 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1"} err="failed to get container status \"cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1\": rpc error: code = NotFound desc = could not find container \"cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1\": container with ID starting with cfdd296c2ff11631649ae9bd0d39e95dae1f3c1a8bf87abb2f282c4fae2398a1 not found: ID does not exist" Feb 17 17:36:44 crc kubenswrapper[4694]: I0217 17:36:44.914085 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" path="/var/lib/kubelet/pods/a012779d-0000-442f-a6be-48e9c00ce0fd/volumes" Feb 17 17:36:46 crc kubenswrapper[4694]: I0217 17:36:46.895928 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62" Feb 17 17:36:47 crc kubenswrapper[4694]: I0217 17:36:47.769856 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"b5b7617690b041aaf5c7c7db2e7efb468ce9049de1ad2bcc576f9ad578fea777"} Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.400050 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hmn2w"] Feb 17 17:37:46 crc kubenswrapper[4694]: E0217 17:37:46.401078 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="registry-server" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.401095 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" 
containerName="registry-server" Feb 17 17:37:46 crc kubenswrapper[4694]: E0217 17:37:46.401128 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="extract-content" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.401138 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="extract-content" Feb 17 17:37:46 crc kubenswrapper[4694]: E0217 17:37:46.401152 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="extract-utilities" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.401161 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="extract-utilities" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.401378 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="a012779d-0000-442f-a6be-48e9c00ce0fd" containerName="registry-server" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.403048 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.411215 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmn2w"] Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.562980 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zjp5\" (UniqueName: \"kubernetes.io/projected/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-kube-api-access-5zjp5\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.563068 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-utilities\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.563345 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-catalog-content\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.665888 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zjp5\" (UniqueName: \"kubernetes.io/projected/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-kube-api-access-5zjp5\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.665953 4694 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-utilities\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.666011 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-catalog-content\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.666515 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-catalog-content\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.666630 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-utilities\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.688067 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zjp5\" (UniqueName: \"kubernetes.io/projected/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-kube-api-access-5zjp5\") pod \"community-operators-hmn2w\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:46 crc kubenswrapper[4694]: I0217 17:37:46.726566 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:47 crc kubenswrapper[4694]: I0217 17:37:47.298345 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmn2w"] Feb 17 17:37:48 crc kubenswrapper[4694]: I0217 17:37:48.332876 4694 generic.go:334] "Generic (PLEG): container finished" podID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerID="a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e" exitCode=0 Feb 17 17:37:48 crc kubenswrapper[4694]: I0217 17:37:48.333315 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerDied","Data":"a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e"} Feb 17 17:37:48 crc kubenswrapper[4694]: I0217 17:37:48.333393 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerStarted","Data":"41d66f4f73e3018084d923b5d2ff022f50ed68e7423854b7d853d27529a817cd"} Feb 17 17:37:49 crc kubenswrapper[4694]: I0217 17:37:49.344485 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerStarted","Data":"fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d"} Feb 17 17:37:49 crc kubenswrapper[4694]: E0217 17:37:49.506727 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e16e694_b6d3_4873_815e_b57a3d2bfa3d.slice/crio-fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:37:50 crc kubenswrapper[4694]: I0217 17:37:50.363906 4694 generic.go:334] "Generic (PLEG): 
container finished" podID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerID="fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d" exitCode=0 Feb 17 17:37:50 crc kubenswrapper[4694]: I0217 17:37:50.364053 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerDied","Data":"fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d"} Feb 17 17:37:51 crc kubenswrapper[4694]: I0217 17:37:51.379074 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerStarted","Data":"c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c"} Feb 17 17:37:51 crc kubenswrapper[4694]: I0217 17:37:51.398474 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hmn2w" podStartSLOduration=2.971453313 podStartE2EDuration="5.398456475s" podCreationTimestamp="2026-02-17 17:37:46 +0000 UTC" firstStartedPulling="2026-02-17 17:37:48.336073722 +0000 UTC m=+3336.093149056" lastFinishedPulling="2026-02-17 17:37:50.763076884 +0000 UTC m=+3338.520152218" observedRunningTime="2026-02-17 17:37:51.394548698 +0000 UTC m=+3339.151624022" watchObservedRunningTime="2026-02-17 17:37:51.398456475 +0000 UTC m=+3339.155531799" Feb 17 17:37:56 crc kubenswrapper[4694]: I0217 17:37:56.727584 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:56 crc kubenswrapper[4694]: I0217 17:37:56.729418 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:56 crc kubenswrapper[4694]: I0217 17:37:56.774414 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:57 crc kubenswrapper[4694]: I0217 17:37:57.498282 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:37:57 crc kubenswrapper[4694]: I0217 17:37:57.550140 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmn2w"] Feb 17 17:37:59 crc kubenswrapper[4694]: I0217 17:37:59.453971 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hmn2w" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="registry-server" containerID="cri-o://c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c" gracePeriod=2 Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.011008 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.135569 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zjp5\" (UniqueName: \"kubernetes.io/projected/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-kube-api-access-5zjp5\") pod \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.135950 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-catalog-content\") pod \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.136146 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-utilities\") pod 
\"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\" (UID: \"0e16e694-b6d3-4873-815e-b57a3d2bfa3d\") " Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.136660 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-utilities" (OuterVolumeSpecName: "utilities") pod "0e16e694-b6d3-4873-815e-b57a3d2bfa3d" (UID: "0e16e694-b6d3-4873-815e-b57a3d2bfa3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.141680 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-kube-api-access-5zjp5" (OuterVolumeSpecName: "kube-api-access-5zjp5") pod "0e16e694-b6d3-4873-815e-b57a3d2bfa3d" (UID: "0e16e694-b6d3-4873-815e-b57a3d2bfa3d"). InnerVolumeSpecName "kube-api-access-5zjp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.194070 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e16e694-b6d3-4873-815e-b57a3d2bfa3d" (UID: "0e16e694-b6d3-4873-815e-b57a3d2bfa3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.238765 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.238811 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zjp5\" (UniqueName: \"kubernetes.io/projected/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-kube-api-access-5zjp5\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.238859 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e16e694-b6d3-4873-815e-b57a3d2bfa3d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.465675 4694 generic.go:334] "Generic (PLEG): container finished" podID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerID="c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c" exitCode=0 Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.465729 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerDied","Data":"c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c"} Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.465759 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmn2w" event={"ID":"0e16e694-b6d3-4873-815e-b57a3d2bfa3d","Type":"ContainerDied","Data":"41d66f4f73e3018084d923b5d2ff022f50ed68e7423854b7d853d27529a817cd"} Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.465762 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmn2w" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.465777 4694 scope.go:117] "RemoveContainer" containerID="c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.498899 4694 scope.go:117] "RemoveContainer" containerID="fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.515096 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmn2w"] Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.525346 4694 scope.go:117] "RemoveContainer" containerID="a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.532324 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hmn2w"] Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.562140 4694 scope.go:117] "RemoveContainer" containerID="c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c" Feb 17 17:38:00 crc kubenswrapper[4694]: E0217 17:38:00.563216 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c\": container with ID starting with c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c not found: ID does not exist" containerID="c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.563278 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c"} err="failed to get container status \"c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c\": rpc error: code = NotFound desc = could not find 
container \"c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c\": container with ID starting with c6897f89edecffbfa3695a6861a2d326431c4bf3ea48b91980094412102cf23c not found: ID does not exist" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.563312 4694 scope.go:117] "RemoveContainer" containerID="fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d" Feb 17 17:38:00 crc kubenswrapper[4694]: E0217 17:38:00.563845 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d\": container with ID starting with fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d not found: ID does not exist" containerID="fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.563893 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d"} err="failed to get container status \"fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d\": rpc error: code = NotFound desc = could not find container \"fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d\": container with ID starting with fae110b68871b666528887a86881c670992c3f2aac85fbcb50d8ea3e20b47c4d not found: ID does not exist" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.563925 4694 scope.go:117] "RemoveContainer" containerID="a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e" Feb 17 17:38:00 crc kubenswrapper[4694]: E0217 17:38:00.564881 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e\": container with ID starting with a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e not found: ID does 
not exist" containerID="a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.564916 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e"} err="failed to get container status \"a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e\": rpc error: code = NotFound desc = could not find container \"a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e\": container with ID starting with a8a9c00efc89275bf3e84c2e003a14e4bec67b76daf001c257ea5e822c3e4e7e not found: ID does not exist" Feb 17 17:38:00 crc kubenswrapper[4694]: I0217 17:38:00.918873 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" path="/var/lib/kubelet/pods/0e16e694-b6d3-4873-815e-b57a3d2bfa3d/volumes" Feb 17 17:38:14 crc kubenswrapper[4694]: I0217 17:38:14.589822 4694 generic.go:334] "Generic (PLEG): container finished" podID="5a4a02dc-9cc2-4445-9624-359734b69ae6" containerID="cda421b3aded1b01032177260cc97ca56500daf0a7290ecbc79e2373584cdbcd" exitCode=0 Feb 17 17:38:14 crc kubenswrapper[4694]: I0217 17:38:14.589910 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5a4a02dc-9cc2-4445-9624-359734b69ae6","Type":"ContainerDied","Data":"cda421b3aded1b01032177260cc97ca56500daf0a7290ecbc79e2373584cdbcd"} Feb 17 17:38:15 crc kubenswrapper[4694]: I0217 17:38:15.980579 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027139 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ca-certs\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027251 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ssh-key\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027294 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027337 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-temporary\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027361 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config-secret\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027418 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5zcx6\" (UniqueName: \"kubernetes.io/projected/5a4a02dc-9cc2-4445-9624-359734b69ae6-kube-api-access-5zcx6\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027455 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027523 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-config-data\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.027559 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-workdir\") pod \"5a4a02dc-9cc2-4445-9624-359734b69ae6\" (UID: \"5a4a02dc-9cc2-4445-9624-359734b69ae6\") " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.029797 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.030136 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-config-data" (OuterVolumeSpecName: "config-data") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.033851 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.035374 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4a02dc-9cc2-4445-9624-359734b69ae6-kube-api-access-5zcx6" (OuterVolumeSpecName: "kube-api-access-5zcx6") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "kube-api-access-5zcx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.036018 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.062371 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.073121 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.074211 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.110325 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5a4a02dc-9cc2-4445-9624-359734b69ae6" (UID: "5a4a02dc-9cc2-4445-9624-359734b69ae6"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129779 4694 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129820 4694 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129832 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129842 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zcx6\" (UniqueName: \"kubernetes.io/projected/5a4a02dc-9cc2-4445-9624-359734b69ae6-kube-api-access-5zcx6\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129852 4694 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129860 4694 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a4a02dc-9cc2-4445-9624-359734b69ae6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129870 4694 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5a4a02dc-9cc2-4445-9624-359734b69ae6-test-operator-ephemeral-workdir\") on node \"crc\" 
DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129878 4694 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.129887 4694 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a4a02dc-9cc2-4445-9624-359734b69ae6-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.147287 4694 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.231275 4694 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.608364 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5a4a02dc-9cc2-4445-9624-359734b69ae6","Type":"ContainerDied","Data":"7912e6512b66893cce9646ebf25ff1945a55086b07cd9f3d80577fd8a4fbb9e3"} Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.608418 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7912e6512b66893cce9646ebf25ff1945a55086b07cd9f3d80577fd8a4fbb9e3" Feb 17 17:38:16 crc kubenswrapper[4694]: I0217 17:38:16.608431 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.198678 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n5dqp"] Feb 17 17:38:22 crc kubenswrapper[4694]: E0217 17:38:22.199657 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4a02dc-9cc2-4445-9624-359734b69ae6" containerName="tempest-tests-tempest-tests-runner" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.199672 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4a02dc-9cc2-4445-9624-359734b69ae6" containerName="tempest-tests-tempest-tests-runner" Feb 17 17:38:22 crc kubenswrapper[4694]: E0217 17:38:22.199695 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="registry-server" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.199703 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="registry-server" Feb 17 17:38:22 crc kubenswrapper[4694]: E0217 17:38:22.199718 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="extract-content" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.199727 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="extract-content" Feb 17 17:38:22 crc kubenswrapper[4694]: E0217 17:38:22.199772 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="extract-utilities" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.199780 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="extract-utilities" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.200027 4694 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5a4a02dc-9cc2-4445-9624-359734b69ae6" containerName="tempest-tests-tempest-tests-runner" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.200055 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e16e694-b6d3-4873-815e-b57a3d2bfa3d" containerName="registry-server" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.201849 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.220229 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5dqp"] Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.253014 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmwzz\" (UniqueName: \"kubernetes.io/projected/d17e4877-e840-4c64-9a64-39c6a0f9af34-kube-api-access-jmwzz\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.253105 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-utilities\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.253669 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-catalog-content\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.355954 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-catalog-content\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.356026 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmwzz\" (UniqueName: \"kubernetes.io/projected/d17e4877-e840-4c64-9a64-39c6a0f9af34-kube-api-access-jmwzz\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.356091 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-utilities\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.356485 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-catalog-content\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.356588 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-utilities\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.375578 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmwzz\" 
(UniqueName: \"kubernetes.io/projected/d17e4877-e840-4c64-9a64-39c6a0f9af34-kube-api-access-jmwzz\") pod \"redhat-operators-n5dqp\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:22 crc kubenswrapper[4694]: I0217 17:38:22.531537 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:23 crc kubenswrapper[4694]: I0217 17:38:23.014504 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5dqp"] Feb 17 17:38:23 crc kubenswrapper[4694]: I0217 17:38:23.680137 4694 generic.go:334] "Generic (PLEG): container finished" podID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerID="c1848defa918bb930c59781933c91215aad4a8f4f32704bb6deb1128371db53c" exitCode=0 Feb 17 17:38:23 crc kubenswrapper[4694]: I0217 17:38:23.680233 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerDied","Data":"c1848defa918bb930c59781933c91215aad4a8f4f32704bb6deb1128371db53c"} Feb 17 17:38:23 crc kubenswrapper[4694]: I0217 17:38:23.680428 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerStarted","Data":"fdf52a29c53c7972e1e141fafe4596c4173fe91be08743320262b81dbc379c05"} Feb 17 17:38:24 crc kubenswrapper[4694]: I0217 17:38:24.691418 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerStarted","Data":"14a21416a74d9fc6ef2f2c0d8ac0c7a6dba3e1784b4557ef35bbd8cd1994394e"} Feb 17 17:38:25 crc kubenswrapper[4694]: I0217 17:38:25.701812 4694 generic.go:334] "Generic (PLEG): container finished" podID="d17e4877-e840-4c64-9a64-39c6a0f9af34" 
containerID="14a21416a74d9fc6ef2f2c0d8ac0c7a6dba3e1784b4557ef35bbd8cd1994394e" exitCode=0 Feb 17 17:38:25 crc kubenswrapper[4694]: I0217 17:38:25.702169 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerDied","Data":"14a21416a74d9fc6ef2f2c0d8ac0c7a6dba3e1784b4557ef35bbd8cd1994394e"} Feb 17 17:38:26 crc kubenswrapper[4694]: I0217 17:38:26.711252 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerStarted","Data":"0936fa80ae1e36489ea190b02c2be4fdabee1961146fb9f69ce26510dc36e7f4"} Feb 17 17:38:26 crc kubenswrapper[4694]: I0217 17:38:26.733981 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n5dqp" podStartSLOduration=2.341957895 podStartE2EDuration="4.733960404s" podCreationTimestamp="2026-02-17 17:38:22 +0000 UTC" firstStartedPulling="2026-02-17 17:38:23.681675733 +0000 UTC m=+3371.438751057" lastFinishedPulling="2026-02-17 17:38:26.073678242 +0000 UTC m=+3373.830753566" observedRunningTime="2026-02-17 17:38:26.726638932 +0000 UTC m=+3374.483714256" watchObservedRunningTime="2026-02-17 17:38:26.733960404 +0000 UTC m=+3374.491035738" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.019726 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.021658 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.031689 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qgff7" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.034623 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.155353 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.155747 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdn8k\" (UniqueName: \"kubernetes.io/projected/6cf976c8-b5ae-4b82-ad19-5d28b6196b80-kube-api-access-cdn8k\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.258315 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.258433 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdn8k\" (UniqueName: 
\"kubernetes.io/projected/6cf976c8-b5ae-4b82-ad19-5d28b6196b80-kube-api-access-cdn8k\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.259251 4694 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.294149 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdn8k\" (UniqueName: \"kubernetes.io/projected/6cf976c8-b5ae-4b82-ad19-5d28b6196b80-kube-api-access-cdn8k\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.296922 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6cf976c8-b5ae-4b82-ad19-5d28b6196b80\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.341466 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 17:38:27 crc kubenswrapper[4694]: I0217 17:38:27.796084 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 17:38:28 crc kubenswrapper[4694]: I0217 17:38:28.729076 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6cf976c8-b5ae-4b82-ad19-5d28b6196b80","Type":"ContainerStarted","Data":"250c97143da63d37e78e60cb330aa30fa56f014cc664ce570f12cfa7307a6f06"} Feb 17 17:38:29 crc kubenswrapper[4694]: I0217 17:38:29.742394 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6cf976c8-b5ae-4b82-ad19-5d28b6196b80","Type":"ContainerStarted","Data":"ac2e23f7527246802e8d55848c3440629ba9ee123419513779a2b08ebddef1f7"} Feb 17 17:38:29 crc kubenswrapper[4694]: I0217 17:38:29.756293 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.854605735 podStartE2EDuration="2.756273129s" podCreationTimestamp="2026-02-17 17:38:27 +0000 UTC" firstStartedPulling="2026-02-17 17:38:27.800721662 +0000 UTC m=+3375.557796996" lastFinishedPulling="2026-02-17 17:38:28.702389066 +0000 UTC m=+3376.459464390" observedRunningTime="2026-02-17 17:38:29.755895588 +0000 UTC m=+3377.512970912" watchObservedRunningTime="2026-02-17 17:38:29.756273129 +0000 UTC m=+3377.513348473" Feb 17 17:38:32 crc kubenswrapper[4694]: I0217 17:38:32.531833 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:32 crc kubenswrapper[4694]: I0217 17:38:32.532243 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:32 
crc kubenswrapper[4694]: I0217 17:38:32.608194 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:32 crc kubenswrapper[4694]: I0217 17:38:32.809277 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:32 crc kubenswrapper[4694]: I0217 17:38:32.854258 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5dqp"] Feb 17 17:38:34 crc kubenswrapper[4694]: I0217 17:38:34.779988 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5dqp" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="registry-server" containerID="cri-o://0936fa80ae1e36489ea190b02c2be4fdabee1961146fb9f69ce26510dc36e7f4" gracePeriod=2 Feb 17 17:38:36 crc kubenswrapper[4694]: I0217 17:38:36.805421 4694 generic.go:334] "Generic (PLEG): container finished" podID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerID="0936fa80ae1e36489ea190b02c2be4fdabee1961146fb9f69ce26510dc36e7f4" exitCode=0 Feb 17 17:38:36 crc kubenswrapper[4694]: I0217 17:38:36.805490 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerDied","Data":"0936fa80ae1e36489ea190b02c2be4fdabee1961146fb9f69ce26510dc36e7f4"} Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.178577 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.263807 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-utilities\") pod \"d17e4877-e840-4c64-9a64-39c6a0f9af34\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.263858 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmwzz\" (UniqueName: \"kubernetes.io/projected/d17e4877-e840-4c64-9a64-39c6a0f9af34-kube-api-access-jmwzz\") pod \"d17e4877-e840-4c64-9a64-39c6a0f9af34\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.264141 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-catalog-content\") pod \"d17e4877-e840-4c64-9a64-39c6a0f9af34\" (UID: \"d17e4877-e840-4c64-9a64-39c6a0f9af34\") " Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.265097 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-utilities" (OuterVolumeSpecName: "utilities") pod "d17e4877-e840-4c64-9a64-39c6a0f9af34" (UID: "d17e4877-e840-4c64-9a64-39c6a0f9af34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.272818 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17e4877-e840-4c64-9a64-39c6a0f9af34-kube-api-access-jmwzz" (OuterVolumeSpecName: "kube-api-access-jmwzz") pod "d17e4877-e840-4c64-9a64-39c6a0f9af34" (UID: "d17e4877-e840-4c64-9a64-39c6a0f9af34"). InnerVolumeSpecName "kube-api-access-jmwzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.366402 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.366430 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmwzz\" (UniqueName: \"kubernetes.io/projected/d17e4877-e840-4c64-9a64-39c6a0f9af34-kube-api-access-jmwzz\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.415102 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d17e4877-e840-4c64-9a64-39c6a0f9af34" (UID: "d17e4877-e840-4c64-9a64-39c6a0f9af34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.468483 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d17e4877-e840-4c64-9a64-39c6a0f9af34-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.817649 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5dqp" event={"ID":"d17e4877-e840-4c64-9a64-39c6a0f9af34","Type":"ContainerDied","Data":"fdf52a29c53c7972e1e141fafe4596c4173fe91be08743320262b81dbc379c05"} Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.817708 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n5dqp" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.817973 4694 scope.go:117] "RemoveContainer" containerID="0936fa80ae1e36489ea190b02c2be4fdabee1961146fb9f69ce26510dc36e7f4" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.841868 4694 scope.go:117] "RemoveContainer" containerID="14a21416a74d9fc6ef2f2c0d8ac0c7a6dba3e1784b4557ef35bbd8cd1994394e" Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.866710 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5dqp"] Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.882296 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n5dqp"] Feb 17 17:38:37 crc kubenswrapper[4694]: I0217 17:38:37.898313 4694 scope.go:117] "RemoveContainer" containerID="c1848defa918bb930c59781933c91215aad4a8f4f32704bb6deb1128371db53c" Feb 17 17:38:38 crc kubenswrapper[4694]: I0217 17:38:38.910113 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" path="/var/lib/kubelet/pods/d17e4877-e840-4c64-9a64-39c6a0f9af34/volumes" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.639842 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p2hq/must-gather-5sqgk"] Feb 17 17:38:50 crc kubenswrapper[4694]: E0217 17:38:50.640583 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="extract-utilities" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.640593 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="extract-utilities" Feb 17 17:38:50 crc kubenswrapper[4694]: E0217 17:38:50.640629 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="extract-content" Feb 17 17:38:50 crc 
kubenswrapper[4694]: I0217 17:38:50.640636 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="extract-content" Feb 17 17:38:50 crc kubenswrapper[4694]: E0217 17:38:50.640646 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="registry-server" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.640652 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="registry-server" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.640833 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17e4877-e840-4c64-9a64-39c6a0f9af34" containerName="registry-server" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.641730 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.646379 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7p2hq"/"kube-root-ca.crt" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.646380 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7p2hq"/"default-dockercfg-nds5j" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.646481 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7p2hq"/"openshift-service-ca.crt" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.686889 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7p2hq/must-gather-5sqgk"] Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.720839 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkg7m\" (UniqueName: \"kubernetes.io/projected/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-kube-api-access-hkg7m\") pod 
\"must-gather-5sqgk\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.720900 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-must-gather-output\") pod \"must-gather-5sqgk\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.823085 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkg7m\" (UniqueName: \"kubernetes.io/projected/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-kube-api-access-hkg7m\") pod \"must-gather-5sqgk\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.823153 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-must-gather-output\") pod \"must-gather-5sqgk\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.823597 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-must-gather-output\") pod \"must-gather-5sqgk\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.850001 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkg7m\" (UniqueName: \"kubernetes.io/projected/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-kube-api-access-hkg7m\") pod 
\"must-gather-5sqgk\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:50 crc kubenswrapper[4694]: I0217 17:38:50.989135 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:38:51 crc kubenswrapper[4694]: I0217 17:38:51.439549 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7p2hq/must-gather-5sqgk"] Feb 17 17:38:51 crc kubenswrapper[4694]: I0217 17:38:51.966574 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" event={"ID":"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8","Type":"ContainerStarted","Data":"1985f4e3468a195af24738fb77692a6dcb1d58ecc6235d8c429590bc9838f7b2"} Feb 17 17:38:58 crc kubenswrapper[4694]: I0217 17:38:58.022989 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" event={"ID":"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8","Type":"ContainerStarted","Data":"bfee9f0fdc222eb2749abfe06e68fa8e24c211c2c3f9a3f4b916b58f473c6a73"} Feb 17 17:38:58 crc kubenswrapper[4694]: I0217 17:38:58.023515 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" event={"ID":"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8","Type":"ContainerStarted","Data":"3222e896501f3ea336bc55bf14cc29a3edd518bb0aa840ddc7cc9a2952e000fc"} Feb 17 17:38:58 crc kubenswrapper[4694]: I0217 17:38:58.045940 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" podStartSLOduration=2.284127286 podStartE2EDuration="8.045919428s" podCreationTimestamp="2026-02-17 17:38:50 +0000 UTC" firstStartedPulling="2026-02-17 17:38:51.439070747 +0000 UTC m=+3399.196146061" lastFinishedPulling="2026-02-17 17:38:57.200862879 +0000 UTC m=+3404.957938203" observedRunningTime="2026-02-17 17:38:58.037972008 +0000 UTC m=+3405.795047332" 
watchObservedRunningTime="2026-02-17 17:38:58.045919428 +0000 UTC m=+3405.802994752" Feb 17 17:39:00 crc kubenswrapper[4694]: I0217 17:39:00.851084 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-pcmbs"] Feb 17 17:39:00 crc kubenswrapper[4694]: I0217 17:39:00.852575 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:00 crc kubenswrapper[4694]: I0217 17:39:00.917027 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkn6c\" (UniqueName: \"kubernetes.io/projected/ff5567ba-0638-459c-9c08-2a8861d75958-kube-api-access-fkn6c\") pod \"crc-debug-pcmbs\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") " pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:00 crc kubenswrapper[4694]: I0217 17:39:00.917437 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5567ba-0638-459c-9c08-2a8861d75958-host\") pod \"crc-debug-pcmbs\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") " pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:01 crc kubenswrapper[4694]: I0217 17:39:01.018650 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkn6c\" (UniqueName: \"kubernetes.io/projected/ff5567ba-0638-459c-9c08-2a8861d75958-kube-api-access-fkn6c\") pod \"crc-debug-pcmbs\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") " pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:01 crc kubenswrapper[4694]: I0217 17:39:01.018709 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5567ba-0638-459c-9c08-2a8861d75958-host\") pod \"crc-debug-pcmbs\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") " pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 
17:39:01 crc kubenswrapper[4694]: I0217 17:39:01.019561 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5567ba-0638-459c-9c08-2a8861d75958-host\") pod \"crc-debug-pcmbs\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") " pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:01 crc kubenswrapper[4694]: I0217 17:39:01.039343 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkn6c\" (UniqueName: \"kubernetes.io/projected/ff5567ba-0638-459c-9c08-2a8861d75958-kube-api-access-fkn6c\") pod \"crc-debug-pcmbs\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") " pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:01 crc kubenswrapper[4694]: I0217 17:39:01.180384 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" Feb 17 17:39:02 crc kubenswrapper[4694]: I0217 17:39:02.059696 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" event={"ID":"ff5567ba-0638-459c-9c08-2a8861d75958","Type":"ContainerStarted","Data":"855b8ead7aed35f4fe00d74fc0e3046ad668cbdf8b3c96e8bbeee2cef096c9ba"} Feb 17 17:39:13 crc kubenswrapper[4694]: I0217 17:39:13.156779 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" event={"ID":"ff5567ba-0638-459c-9c08-2a8861d75958","Type":"ContainerStarted","Data":"bacc1a5efafed37d82e4fb61794e376467f38113e2905c286f9a455fe615881c"} Feb 17 17:39:13 crc kubenswrapper[4694]: I0217 17:39:13.173302 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" podStartSLOduration=1.859633139 podStartE2EDuration="13.173276828s" podCreationTimestamp="2026-02-17 17:39:00 +0000 UTC" firstStartedPulling="2026-02-17 17:39:01.220376058 +0000 UTC m=+3408.977451382" lastFinishedPulling="2026-02-17 17:39:12.534019747 
+0000 UTC m=+3420.291095071" observedRunningTime="2026-02-17 17:39:13.171934271 +0000 UTC m=+3420.929009595" watchObservedRunningTime="2026-02-17 17:39:13.173276828 +0000 UTC m=+3420.930352152" Feb 17 17:39:14 crc kubenswrapper[4694]: I0217 17:39:14.617768 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:39:14 crc kubenswrapper[4694]: I0217 17:39:14.618087 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.100452 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zx4r8"] Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.104466 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.116390 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zx4r8"] Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.170664 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-catalog-content\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.170804 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-utilities\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.170886 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjf2\" (UniqueName: \"kubernetes.io/projected/c91fe3db-dbf3-4527-96f4-d9063d81e826-kube-api-access-8sjf2\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.273817 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-utilities\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.273926 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8sjf2\" (UniqueName: \"kubernetes.io/projected/c91fe3db-dbf3-4527-96f4-d9063d81e826-kube-api-access-8sjf2\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.274189 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-catalog-content\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.274562 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-utilities\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.275022 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-catalog-content\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.303515 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjf2\" (UniqueName: \"kubernetes.io/projected/c91fe3db-dbf3-4527-96f4-d9063d81e826-kube-api-access-8sjf2\") pod \"certified-operators-zx4r8\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.434263 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:19 crc kubenswrapper[4694]: I0217 17:39:19.999135 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zx4r8"] Feb 17 17:39:20 crc kubenswrapper[4694]: W0217 17:39:20.014984 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc91fe3db_dbf3_4527_96f4_d9063d81e826.slice/crio-419f4a1109becc1d75f0b34642185dff627e4c70dee8daa005eb3b4f530329bf WatchSource:0}: Error finding container 419f4a1109becc1d75f0b34642185dff627e4c70dee8daa005eb3b4f530329bf: Status 404 returned error can't find the container with id 419f4a1109becc1d75f0b34642185dff627e4c70dee8daa005eb3b4f530329bf Feb 17 17:39:20 crc kubenswrapper[4694]: I0217 17:39:20.218769 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx4r8" event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerStarted","Data":"419f4a1109becc1d75f0b34642185dff627e4c70dee8daa005eb3b4f530329bf"} Feb 17 17:39:21 crc kubenswrapper[4694]: I0217 17:39:21.228591 4694 generic.go:334] "Generic (PLEG): container finished" podID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerID="ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a" exitCode=0 Feb 17 17:39:21 crc kubenswrapper[4694]: I0217 17:39:21.228646 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx4r8" event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerDied","Data":"ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a"} Feb 17 17:39:22 crc kubenswrapper[4694]: I0217 17:39:22.239735 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx4r8" 
event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerStarted","Data":"5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1"} Feb 17 17:39:23 crc kubenswrapper[4694]: I0217 17:39:23.270090 4694 generic.go:334] "Generic (PLEG): container finished" podID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerID="5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1" exitCode=0 Feb 17 17:39:23 crc kubenswrapper[4694]: I0217 17:39:23.270693 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx4r8" event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerDied","Data":"5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1"} Feb 17 17:39:26 crc kubenswrapper[4694]: I0217 17:39:26.303816 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx4r8" event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerStarted","Data":"2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e"} Feb 17 17:39:26 crc kubenswrapper[4694]: I0217 17:39:26.330663 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zx4r8" podStartSLOduration=4.845922478 podStartE2EDuration="7.330642782s" podCreationTimestamp="2026-02-17 17:39:19 +0000 UTC" firstStartedPulling="2026-02-17 17:39:21.23299787 +0000 UTC m=+3428.990073194" lastFinishedPulling="2026-02-17 17:39:23.717718174 +0000 UTC m=+3431.474793498" observedRunningTime="2026-02-17 17:39:26.326544078 +0000 UTC m=+3434.083619402" watchObservedRunningTime="2026-02-17 17:39:26.330642782 +0000 UTC m=+3434.087718106" Feb 17 17:39:29 crc kubenswrapper[4694]: I0217 17:39:29.434727 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:29 crc kubenswrapper[4694]: I0217 17:39:29.435305 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:29 crc kubenswrapper[4694]: I0217 17:39:29.485480 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:30 crc kubenswrapper[4694]: I0217 17:39:30.413161 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:30 crc kubenswrapper[4694]: I0217 17:39:30.467280 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zx4r8"] Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.374011 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zx4r8" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="registry-server" containerID="cri-o://2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e" gracePeriod=2 Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.855919 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zx4r8" Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.929572 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-utilities\") pod \"c91fe3db-dbf3-4527-96f4-d9063d81e826\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.929696 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-catalog-content\") pod \"c91fe3db-dbf3-4527-96f4-d9063d81e826\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.929793 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sjf2\" (UniqueName: \"kubernetes.io/projected/c91fe3db-dbf3-4527-96f4-d9063d81e826-kube-api-access-8sjf2\") pod \"c91fe3db-dbf3-4527-96f4-d9063d81e826\" (UID: \"c91fe3db-dbf3-4527-96f4-d9063d81e826\") " Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.930821 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-utilities" (OuterVolumeSpecName: "utilities") pod "c91fe3db-dbf3-4527-96f4-d9063d81e826" (UID: "c91fe3db-dbf3-4527-96f4-d9063d81e826"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.937724 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91fe3db-dbf3-4527-96f4-d9063d81e826-kube-api-access-8sjf2" (OuterVolumeSpecName: "kube-api-access-8sjf2") pod "c91fe3db-dbf3-4527-96f4-d9063d81e826" (UID: "c91fe3db-dbf3-4527-96f4-d9063d81e826"). InnerVolumeSpecName "kube-api-access-8sjf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:39:32 crc kubenswrapper[4694]: I0217 17:39:32.984794 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c91fe3db-dbf3-4527-96f4-d9063d81e826" (UID: "c91fe3db-dbf3-4527-96f4-d9063d81e826"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.031466 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sjf2\" (UniqueName: \"kubernetes.io/projected/c91fe3db-dbf3-4527-96f4-d9063d81e826-kube-api-access-8sjf2\") on node \"crc\" DevicePath \"\"" Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.031510 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.031521 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91fe3db-dbf3-4527-96f4-d9063d81e826-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.384059 4694 generic.go:334] "Generic (PLEG): container finished" podID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerID="2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e" exitCode=0 Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.384107 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx4r8" event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerDied","Data":"2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e"} Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.384134 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zx4r8" event={"ID":"c91fe3db-dbf3-4527-96f4-d9063d81e826","Type":"ContainerDied","Data":"419f4a1109becc1d75f0b34642185dff627e4c70dee8daa005eb3b4f530329bf"}
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.384131 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zx4r8"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.384170 4694 scope.go:117] "RemoveContainer" containerID="2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.405239 4694 scope.go:117] "RemoveContainer" containerID="5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.419382 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zx4r8"]
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.436831 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zx4r8"]
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.440774 4694 scope.go:117] "RemoveContainer" containerID="ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.477785 4694 scope.go:117] "RemoveContainer" containerID="2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e"
Feb 17 17:39:33 crc kubenswrapper[4694]: E0217 17:39:33.478365 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e\": container with ID starting with 2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e not found: ID does not exist" containerID="2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.478403 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e"} err="failed to get container status \"2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e\": rpc error: code = NotFound desc = could not find container \"2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e\": container with ID starting with 2003822a90d6cfe8894ba179263bb0cf687293372a7ecd74eecef4b2522e504e not found: ID does not exist"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.478428 4694 scope.go:117] "RemoveContainer" containerID="5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1"
Feb 17 17:39:33 crc kubenswrapper[4694]: E0217 17:39:33.478750 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1\": container with ID starting with 5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1 not found: ID does not exist" containerID="5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.478771 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1"} err="failed to get container status \"5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1\": rpc error: code = NotFound desc = could not find container \"5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1\": container with ID starting with 5b92baf07a5197be07b1132e378fb9f2fbe6e139e1b299849e4dc026087225f1 not found: ID does not exist"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.478784 4694 scope.go:117] "RemoveContainer" containerID="ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a"
Feb 17 17:39:33 crc kubenswrapper[4694]: E0217 17:39:33.479105 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a\": container with ID starting with ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a not found: ID does not exist" containerID="ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a"
Feb 17 17:39:33 crc kubenswrapper[4694]: I0217 17:39:33.479128 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a"} err="failed to get container status \"ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a\": rpc error: code = NotFound desc = could not find container \"ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a\": container with ID starting with ec7bf21bec1bfdedd631e825e0da1837565636841d1e431e992c0444c939e56a not found: ID does not exist"
Feb 17 17:39:34 crc kubenswrapper[4694]: I0217 17:39:34.907904 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" path="/var/lib/kubelet/pods/c91fe3db-dbf3-4527-96f4-d9063d81e826/volumes"
Feb 17 17:39:44 crc kubenswrapper[4694]: I0217 17:39:44.618270 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:39:44 crc kubenswrapper[4694]: I0217 17:39:44.618852 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:39:50 crc kubenswrapper[4694]: I0217 17:39:50.532842 4694 generic.go:334] "Generic (PLEG): container finished" podID="ff5567ba-0638-459c-9c08-2a8861d75958" containerID="bacc1a5efafed37d82e4fb61794e376467f38113e2905c286f9a455fe615881c" exitCode=0
Feb 17 17:39:50 crc kubenswrapper[4694]: I0217 17:39:50.532955 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs" event={"ID":"ff5567ba-0638-459c-9c08-2a8861d75958","Type":"ContainerDied","Data":"bacc1a5efafed37d82e4fb61794e376467f38113e2905c286f9a455fe615881c"}
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.644147 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs"
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.680952 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-pcmbs"]
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.689183 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-pcmbs"]
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.700926 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkn6c\" (UniqueName: \"kubernetes.io/projected/ff5567ba-0638-459c-9c08-2a8861d75958-kube-api-access-fkn6c\") pod \"ff5567ba-0638-459c-9c08-2a8861d75958\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") "
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.701028 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5567ba-0638-459c-9c08-2a8861d75958-host\") pod \"ff5567ba-0638-459c-9c08-2a8861d75958\" (UID: \"ff5567ba-0638-459c-9c08-2a8861d75958\") "
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.701176 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff5567ba-0638-459c-9c08-2a8861d75958-host" (OuterVolumeSpecName: "host") pod "ff5567ba-0638-459c-9c08-2a8861d75958" (UID: "ff5567ba-0638-459c-9c08-2a8861d75958"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.701754 4694 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5567ba-0638-459c-9c08-2a8861d75958-host\") on node \"crc\" DevicePath \"\""
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.711434 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5567ba-0638-459c-9c08-2a8861d75958-kube-api-access-fkn6c" (OuterVolumeSpecName: "kube-api-access-fkn6c") pod "ff5567ba-0638-459c-9c08-2a8861d75958" (UID: "ff5567ba-0638-459c-9c08-2a8861d75958"). InnerVolumeSpecName "kube-api-access-fkn6c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:39:51 crc kubenswrapper[4694]: I0217 17:39:51.803370 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkn6c\" (UniqueName: \"kubernetes.io/projected/ff5567ba-0638-459c-9c08-2a8861d75958-kube-api-access-fkn6c\") on node \"crc\" DevicePath \"\""
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.554324 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855b8ead7aed35f4fe00d74fc0e3046ad668cbdf8b3c96e8bbeee2cef096c9ba"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.554416 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-pcmbs"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.824867 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-t9pz2"]
Feb 17 17:39:52 crc kubenswrapper[4694]: E0217 17:39:52.825269 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="extract-utilities"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.825284 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="extract-utilities"
Feb 17 17:39:52 crc kubenswrapper[4694]: E0217 17:39:52.825312 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="extract-content"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.825319 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="extract-content"
Feb 17 17:39:52 crc kubenswrapper[4694]: E0217 17:39:52.825337 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="registry-server"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.825344 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="registry-server"
Feb 17 17:39:52 crc kubenswrapper[4694]: E0217 17:39:52.825362 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5567ba-0638-459c-9c08-2a8861d75958" containerName="container-00"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.825371 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5567ba-0638-459c-9c08-2a8861d75958" containerName="container-00"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.825576 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5567ba-0638-459c-9c08-2a8861d75958" containerName="container-00"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.825619 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91fe3db-dbf3-4527-96f4-d9063d81e826" containerName="registry-server"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.826315 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.905115 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5567ba-0638-459c-9c08-2a8861d75958" path="/var/lib/kubelet/pods/ff5567ba-0638-459c-9c08-2a8861d75958/volumes"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.936759 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt2jk\" (UniqueName: \"kubernetes.io/projected/8862bf4c-e49b-468c-a169-ce3f89b8928c-kube-api-access-dt2jk\") pod \"crc-debug-t9pz2\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") " pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:52 crc kubenswrapper[4694]: I0217 17:39:52.936890 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8862bf4c-e49b-468c-a169-ce3f89b8928c-host\") pod \"crc-debug-t9pz2\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") " pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.039307 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt2jk\" (UniqueName: \"kubernetes.io/projected/8862bf4c-e49b-468c-a169-ce3f89b8928c-kube-api-access-dt2jk\") pod \"crc-debug-t9pz2\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") " pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.039444 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8862bf4c-e49b-468c-a169-ce3f89b8928c-host\") pod \"crc-debug-t9pz2\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") " pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.039530 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8862bf4c-e49b-468c-a169-ce3f89b8928c-host\") pod \"crc-debug-t9pz2\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") " pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.061648 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt2jk\" (UniqueName: \"kubernetes.io/projected/8862bf4c-e49b-468c-a169-ce3f89b8928c-kube-api-access-dt2jk\") pod \"crc-debug-t9pz2\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") " pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.151961 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.566594 4694 generic.go:334] "Generic (PLEG): container finished" podID="8862bf4c-e49b-468c-a169-ce3f89b8928c" containerID="76be9ef87d5abcc1e81753d29d313a045e21462839463988b60a4fb7deff5440" exitCode=0
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.566644 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-t9pz2" event={"ID":"8862bf4c-e49b-468c-a169-ce3f89b8928c","Type":"ContainerDied","Data":"76be9ef87d5abcc1e81753d29d313a045e21462839463988b60a4fb7deff5440"}
Feb 17 17:39:53 crc kubenswrapper[4694]: I0217 17:39:53.566952 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-t9pz2" event={"ID":"8862bf4c-e49b-468c-a169-ce3f89b8928c","Type":"ContainerStarted","Data":"44357b18c7518a484dfe9e979a2b67f9c4ca2ab71b7e93d966a1cc4dcc8f6485"}
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.087060 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-t9pz2"]
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.093398 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-t9pz2"]
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.708362 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.770727 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt2jk\" (UniqueName: \"kubernetes.io/projected/8862bf4c-e49b-468c-a169-ce3f89b8928c-kube-api-access-dt2jk\") pod \"8862bf4c-e49b-468c-a169-ce3f89b8928c\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") "
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.771167 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8862bf4c-e49b-468c-a169-ce3f89b8928c-host\") pod \"8862bf4c-e49b-468c-a169-ce3f89b8928c\" (UID: \"8862bf4c-e49b-468c-a169-ce3f89b8928c\") "
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.771278 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8862bf4c-e49b-468c-a169-ce3f89b8928c-host" (OuterVolumeSpecName: "host") pod "8862bf4c-e49b-468c-a169-ce3f89b8928c" (UID: "8862bf4c-e49b-468c-a169-ce3f89b8928c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.771709 4694 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8862bf4c-e49b-468c-a169-ce3f89b8928c-host\") on node \"crc\" DevicePath \"\""
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.776576 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8862bf4c-e49b-468c-a169-ce3f89b8928c-kube-api-access-dt2jk" (OuterVolumeSpecName: "kube-api-access-dt2jk") pod "8862bf4c-e49b-468c-a169-ce3f89b8928c" (UID: "8862bf4c-e49b-468c-a169-ce3f89b8928c"). InnerVolumeSpecName "kube-api-access-dt2jk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.875291 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt2jk\" (UniqueName: \"kubernetes.io/projected/8862bf4c-e49b-468c-a169-ce3f89b8928c-kube-api-access-dt2jk\") on node \"crc\" DevicePath \"\""
Feb 17 17:39:54 crc kubenswrapper[4694]: I0217 17:39:54.917313 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8862bf4c-e49b-468c-a169-ce3f89b8928c" path="/var/lib/kubelet/pods/8862bf4c-e49b-468c-a169-ce3f89b8928c/volumes"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.287543 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-4kbhq"]
Feb 17 17:39:55 crc kubenswrapper[4694]: E0217 17:39:55.288483 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8862bf4c-e49b-468c-a169-ce3f89b8928c" containerName="container-00"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.288497 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8862bf4c-e49b-468c-a169-ce3f89b8928c" containerName="container-00"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.288749 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8862bf4c-e49b-468c-a169-ce3f89b8928c" containerName="container-00"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.289751 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.384346 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f0b367e-8835-4ffb-9349-d5059644e6cb-host\") pod \"crc-debug-4kbhq\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") " pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.384735 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7vcd\" (UniqueName: \"kubernetes.io/projected/8f0b367e-8835-4ffb-9349-d5059644e6cb-kube-api-access-f7vcd\") pod \"crc-debug-4kbhq\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") " pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.487048 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7vcd\" (UniqueName: \"kubernetes.io/projected/8f0b367e-8835-4ffb-9349-d5059644e6cb-kube-api-access-f7vcd\") pod \"crc-debug-4kbhq\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") " pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.487206 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f0b367e-8835-4ffb-9349-d5059644e6cb-host\") pod \"crc-debug-4kbhq\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") " pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.487409 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f0b367e-8835-4ffb-9349-d5059644e6cb-host\") pod \"crc-debug-4kbhq\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") " pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.506437 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7vcd\" (UniqueName: \"kubernetes.io/projected/8f0b367e-8835-4ffb-9349-d5059644e6cb-kube-api-access-f7vcd\") pod \"crc-debug-4kbhq\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") " pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.597592 4694 scope.go:117] "RemoveContainer" containerID="76be9ef87d5abcc1e81753d29d313a045e21462839463988b60a4fb7deff5440"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.597750 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-t9pz2"
Feb 17 17:39:55 crc kubenswrapper[4694]: I0217 17:39:55.618395 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:55 crc kubenswrapper[4694]: W0217 17:39:55.650510 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f0b367e_8835_4ffb_9349_d5059644e6cb.slice/crio-e36a4d47b949e59edf23e93e0b4a2e8928dc7750b73ce1a8db1512c5a6f5ce05 WatchSource:0}: Error finding container e36a4d47b949e59edf23e93e0b4a2e8928dc7750b73ce1a8db1512c5a6f5ce05: Status 404 returned error can't find the container with id e36a4d47b949e59edf23e93e0b4a2e8928dc7750b73ce1a8db1512c5a6f5ce05
Feb 17 17:39:56 crc kubenswrapper[4694]: I0217 17:39:56.607945 4694 generic.go:334] "Generic (PLEG): container finished" podID="8f0b367e-8835-4ffb-9349-d5059644e6cb" containerID="9c396b6ab8b3338a8cb2b272defec7fcac8e22936f67882f2e89407aa8fa646f" exitCode=0
Feb 17 17:39:56 crc kubenswrapper[4694]: I0217 17:39:56.608005 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-4kbhq" event={"ID":"8f0b367e-8835-4ffb-9349-d5059644e6cb","Type":"ContainerDied","Data":"9c396b6ab8b3338a8cb2b272defec7fcac8e22936f67882f2e89407aa8fa646f"}
Feb 17 17:39:56 crc kubenswrapper[4694]: I0217 17:39:56.608038 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/crc-debug-4kbhq" event={"ID":"8f0b367e-8835-4ffb-9349-d5059644e6cb","Type":"ContainerStarted","Data":"e36a4d47b949e59edf23e93e0b4a2e8928dc7750b73ce1a8db1512c5a6f5ce05"}
Feb 17 17:39:56 crc kubenswrapper[4694]: I0217 17:39:56.653016 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-4kbhq"]
Feb 17 17:39:56 crc kubenswrapper[4694]: I0217 17:39:56.665031 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p2hq/crc-debug-4kbhq"]
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.718003 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.846557 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7vcd\" (UniqueName: \"kubernetes.io/projected/8f0b367e-8835-4ffb-9349-d5059644e6cb-kube-api-access-f7vcd\") pod \"8f0b367e-8835-4ffb-9349-d5059644e6cb\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") "
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.846646 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f0b367e-8835-4ffb-9349-d5059644e6cb-host\") pod \"8f0b367e-8835-4ffb-9349-d5059644e6cb\" (UID: \"8f0b367e-8835-4ffb-9349-d5059644e6cb\") "
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.846762 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f0b367e-8835-4ffb-9349-d5059644e6cb-host" (OuterVolumeSpecName: "host") pod "8f0b367e-8835-4ffb-9349-d5059644e6cb" (UID: "8f0b367e-8835-4ffb-9349-d5059644e6cb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.847237 4694 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f0b367e-8835-4ffb-9349-d5059644e6cb-host\") on node \"crc\" DevicePath \"\""
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.852197 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0b367e-8835-4ffb-9349-d5059644e6cb-kube-api-access-f7vcd" (OuterVolumeSpecName: "kube-api-access-f7vcd") pod "8f0b367e-8835-4ffb-9349-d5059644e6cb" (UID: "8f0b367e-8835-4ffb-9349-d5059644e6cb"). InnerVolumeSpecName "kube-api-access-f7vcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 17:39:57 crc kubenswrapper[4694]: I0217 17:39:57.950383 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7vcd\" (UniqueName: \"kubernetes.io/projected/8f0b367e-8835-4ffb-9349-d5059644e6cb-kube-api-access-f7vcd\") on node \"crc\" DevicePath \"\""
Feb 17 17:39:58 crc kubenswrapper[4694]: I0217 17:39:58.625359 4694 scope.go:117] "RemoveContainer" containerID="9c396b6ab8b3338a8cb2b272defec7fcac8e22936f67882f2e89407aa8fa646f"
Feb 17 17:39:58 crc kubenswrapper[4694]: I0217 17:39:58.625400 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p2hq/crc-debug-4kbhq"
Feb 17 17:39:58 crc kubenswrapper[4694]: I0217 17:39:58.905752 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0b367e-8835-4ffb-9349-d5059644e6cb" path="/var/lib/kubelet/pods/8f0b367e-8835-4ffb-9349-d5059644e6cb/volumes"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.323294 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cff554946-ddg9w_1843aa1a-460a-42ec-adb2-b20b48c71a90/barbican-api/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.517698 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cff554946-ddg9w_1843aa1a-460a-42ec-adb2-b20b48c71a90/barbican-api-log/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.521202 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-694d8d5c8-nq2bp_d79bfe1a-e161-41b0-8eed-0f1879b1f990/barbican-keystone-listener/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.572541 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-694d8d5c8-nq2bp_d79bfe1a-e161-41b0-8eed-0f1879b1f990/barbican-keystone-listener-log/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.710126 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b65b64c9-dlmdm_0470d53c-a76c-4cf3-8f95-1ae293182645/barbican-worker/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.741906 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b65b64c9-dlmdm_0470d53c-a76c-4cf3-8f95-1ae293182645/barbican-worker-log/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.917840 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5_c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:12 crc kubenswrapper[4694]: I0217 17:40:12.967088 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/ceilometer-central-agent/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.021293 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/ceilometer-notification-agent/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.089697 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/proxy-httpd/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.126437 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/sg-core/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.241676 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e5767e0e-627e-4e5a-9ee9-c150b7bc2d72/cinder-api/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.274291 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e5767e0e-627e-4e5a-9ee9-c150b7bc2d72/cinder-api-log/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.440963 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_83a8e274-5312-4be8-9f81-c7b13a2effe1/cinder-scheduler/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.486460 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_83a8e274-5312-4be8-9f81-c7b13a2effe1/probe/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.657277 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4_af1e17e7-cd69-4f0f-8e3f-e36399e001a8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.699874 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c_3b2a9feb-de71-42ff-b0ae-f4697f525469/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:13 crc kubenswrapper[4694]: I0217 17:40:13.868798 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-l2p9l_c65ebfa5-bbb3-4011-8593-8cfbd2765254/init/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.039099 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-l2p9l_c65ebfa5-bbb3-4011-8593-8cfbd2765254/init/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.056099 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-l2p9l_c65ebfa5-bbb3-4011-8593-8cfbd2765254/dnsmasq-dns/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.056530 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lbchr_ef5a14b7-490f-48b6-a150-6437a2a18fda/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.216126 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b48a41b5-4a74-4883-a067-660e674ceecb/glance-httpd/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.239104 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b48a41b5-4a74-4883-a067-660e674ceecb/glance-log/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.389901 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e061fc0e-dd5f-429f-8275-0a744dfc846d/glance-log/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.451364 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e061fc0e-dd5f-429f-8275-0a744dfc846d/glance-httpd/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.524615 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b8f4f9856-rcwl9_17711b82-3f49-41da-b17d-785c70869492/horizon/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.617410 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.617461 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.617506 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.618267 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5b7617690b041aaf5c7c7db2e7efb468ce9049de1ad2bcc576f9ad578fea777"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.618331 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://b5b7617690b041aaf5c7c7db2e7efb468ce9049de1ad2bcc576f9ad578fea777" gracePeriod=600
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.680145 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5_fc2a86d9-61f8-4af3-9835-2aeea9736b84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.854250 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b8f4f9856-rcwl9_17711b82-3f49-41da-b17d-785c70869492/horizon-log/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.904817 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rwrb8_a74ac9aa-ac88-4b13-b10b-9fe0f9195f35/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.928895 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="b5b7617690b041aaf5c7c7db2e7efb468ce9049de1ad2bcc576f9ad578fea777" exitCode=0
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.928969 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"b5b7617690b041aaf5c7c7db2e7efb468ce9049de1ad2bcc576f9ad578fea777"}
Feb 17 17:40:14 crc kubenswrapper[4694]: I0217 17:40:14.929201 4694 scope.go:117] "RemoveContainer" containerID="d988849c3eeb22950448ea45f187ac89f76096e318d080ef853c5719950f9e62"
Feb 17 17:40:15 crc kubenswrapper[4694]: I0217 17:40:15.163860 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8667649c99-28rzh_b0aae110-6e5c-4f32-95d9-b4b3429ca622/keystone-api/0.log"
Feb 17 17:40:15 crc kubenswrapper[4694]: I0217 17:40:15.301712 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_eeee4610-5faa-46a3-815b-2b04150c9abf/kube-state-metrics/0.log"
Feb 17 17:40:15 crc kubenswrapper[4694]: I0217 17:40:15.491977 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh_032fed18-d394-4743-ac9d-efa8d472bbc2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:15 crc kubenswrapper[4694]: I0217 17:40:15.856481 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-699759854f-bj949_c626c95e-85d3-4ba2-8453-060b57d2ca05/neutron-httpd/0.log"
Feb 17 17:40:15 crc kubenswrapper[4694]: I0217 17:40:15.941752 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-699759854f-bj949_c626c95e-85d3-4ba2-8453-060b57d2ca05/neutron-api/0.log"
Feb 17 17:40:15 crc kubenswrapper[4694]: I0217 17:40:15.944701 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e"}
Feb 17 17:40:16 crc kubenswrapper[4694]: I0217 17:40:16.058245 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4_55da2ff6-efab-4cee-acb9-b0a04edc8980/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 17 17:40:16 crc kubenswrapper[4694]: I0217 17:40:16.558577 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8083aa6-ef30-42ca-b979-e21a0697ce79/nova-api-log/0.log"
Feb 17 17:40:16 crc kubenswrapper[4694]: I0217 17:40:16.566838 4694 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_nova-cell0-conductor-0_6d7b20d4-ec67-4732-bb23-97f5dacf1af1/nova-cell0-conductor-conductor/0.log" Feb 17 17:40:16 crc kubenswrapper[4694]: I0217 17:40:16.720187 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8083aa6-ef30-42ca-b979-e21a0697ce79/nova-api-api/0.log" Feb 17 17:40:16 crc kubenswrapper[4694]: I0217 17:40:16.891850 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_91495551-d244-497c-b8f1-376b3206a3aa/nova-cell1-conductor-conductor/0.log" Feb 17 17:40:16 crc kubenswrapper[4694]: I0217 17:40:16.968880 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9894d581-eaea-45f8-a4ca-1a73c9fc778b/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: I0217 17:40:17.129110 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ddccg_f97bc145-9375-4a33-8b64-699355feb0fd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: I0217 17:40:17.264128 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f396879b-c24b-478f-b98f-24347a13a36d/nova-metadata-log/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: I0217 17:40:17.497586 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_204bee86-a2fa-4fb3-bb90-60f67cb66bc7/nova-scheduler-scheduler/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: I0217 17:40:17.565117 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a7cab10b-d837-44e6-81c7-8bfdb36a4d3c/mysql-bootstrap/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: I0217 17:40:17.718673 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a7cab10b-d837-44e6-81c7-8bfdb36a4d3c/mysql-bootstrap/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: 
I0217 17:40:17.802770 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a7cab10b-d837-44e6-81c7-8bfdb36a4d3c/galera/0.log" Feb 17 17:40:17 crc kubenswrapper[4694]: I0217 17:40:17.954992 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a0c805cf-d310-4594-8584-1061330e4c94/mysql-bootstrap/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.150775 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a0c805cf-d310-4594-8584-1061330e4c94/galera/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.155577 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a0c805cf-d310-4594-8584-1061330e4c94/mysql-bootstrap/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.257095 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f396879b-c24b-478f-b98f-24347a13a36d/nova-metadata-metadata/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.366005 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xtmfm_e1c89368-7041-4c84-8ba6-624d0f0b695e/openstack-network-exporter/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.389757 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_30281bdf-b35e-4ac1-8cde-8a333e24f564/openstackclient/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.573064 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovsdb-server-init/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.756026 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovsdb-server-init/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.757350 4694 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovsdb-server/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.773059 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovs-vswitchd/0.log" Feb 17 17:40:18 crc kubenswrapper[4694]: I0217 17:40:18.969485 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-stczv_45514b0e-57f3-494a-823a-2a0f0c2f728d/ovn-controller/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.065888 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rwf25_a7d24b5a-8b19-4532-a8ca-b34ebad591d1/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.172813 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10287785-ab79-4801-903b-0b4acdc8aca8/openstack-network-exporter/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.193062 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10287785-ab79-4801-903b-0b4acdc8aca8/ovn-northd/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.335305 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf651b7-0b06-4b95-916e-e7fe6630d272/openstack-network-exporter/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.387489 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf651b7-0b06-4b95-916e-e7fe6630d272/ovsdbserver-nb/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.503791 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_391b1870-6558-40d0-be12-d31b3a57ed32/openstack-network-exporter/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 
17:40:19.570504 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_391b1870-6558-40d0-be12-d31b3a57ed32/ovsdbserver-sb/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.786890 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7569cc6b-bv5js_1a8ff002-04b9-4dfa-af27-36823f7918a8/placement-api/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.790571 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7569cc6b-bv5js_1a8ff002-04b9-4dfa-af27-36823f7918a8/placement-log/0.log" Feb 17 17:40:19 crc kubenswrapper[4694]: I0217 17:40:19.876894 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_181e3039-f77f-47e6-acef-e1dcd93d30f8/setup-container/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.046245 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_181e3039-f77f-47e6-acef-e1dcd93d30f8/setup-container/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.135181 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9698ccc3-769b-43aa-a4bf-f7c95342555a/setup-container/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.167359 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_181e3039-f77f-47e6-acef-e1dcd93d30f8/rabbitmq/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.353467 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9698ccc3-769b-43aa-a4bf-f7c95342555a/setup-container/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.375516 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s_fe2671a8-04cd-4b09-ba6a-e6250762985e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:20 crc 
kubenswrapper[4694]: I0217 17:40:20.459697 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9698ccc3-769b-43aa-a4bf-f7c95342555a/rabbitmq/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.585373 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-d487f_45da847c-7705-408d-a3c4-b05253e15d3f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.671570 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj_ef784368-4cf0-42fb-b4c5-b5ca19fe472a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.824409 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wfs8m_f7ab42be-d837-4b1d-8d80-164f92fc205a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:20 crc kubenswrapper[4694]: I0217 17:40:20.929102 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vsfcq_4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a/ssh-known-hosts-edpm-deployment/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.086008 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f54df747-vdnkk_c0f37d92-d923-43f2-807f-d52cd9003a2c/proxy-server/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.152663 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f54df747-vdnkk_c0f37d92-d923-43f2-807f-d52cd9003a2c/proxy-httpd/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.299631 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-q6jc9_78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3/swift-ring-rebalance/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 
17:40:21.331703 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-auditor/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.433054 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-reaper/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.526896 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-server/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.563749 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-replicator/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.731728 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-auditor/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.834228 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-server/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.881464 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-replicator/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.913368 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-updater/0.log" Feb 17 17:40:21 crc kubenswrapper[4694]: I0217 17:40:21.956540 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-auditor/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.071683 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-expirer/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.080946 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-replicator/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.126137 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-server/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.190488 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-updater/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.278237 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/swift-recon-cron/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.286147 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/rsync/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.499707 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj_fe57a3c1-260f-4f46-b977-82656c0ad9d6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.513506 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5a4a02dc-9cc2-4445-9624-359734b69ae6/tempest-tests-tempest-tests-runner/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 17:40:22.661169 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6cf976c8-b5ae-4b82-ad19-5d28b6196b80/test-operator-logs-container/0.log" Feb 17 17:40:22 crc kubenswrapper[4694]: I0217 
17:40:22.798705 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv_bfbc588a-92ae-49ee-bcad-433dc28ecad7/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:40:29 crc kubenswrapper[4694]: I0217 17:40:29.583090 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_11f9dba1-ca35-4d40-b07b-44a141b8a80b/memcached/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.170700 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/util/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.399930 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/pull/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.450546 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/pull/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.456392 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/util/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.651519 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/util/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.658025 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/pull/0.log" Feb 17 17:40:48 crc kubenswrapper[4694]: I0217 17:40:48.696060 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/extract/0.log" Feb 17 17:40:49 crc kubenswrapper[4694]: I0217 17:40:49.067514 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-dkblx_6df3483a-cb85-4da5-a314-e0aea8874af8/manager/0.log" Feb 17 17:40:49 crc kubenswrapper[4694]: I0217 17:40:49.412973 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68c6d499cb-pkrtr_4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd/manager/0.log" Feb 17 17:40:49 crc kubenswrapper[4694]: I0217 17:40:49.672341 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-cc8kn_17fc1c20-dd01-4529-99a2-5a758dd7d8f1/manager/0.log" Feb 17 17:40:49 crc kubenswrapper[4694]: I0217 17:40:49.911798 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-5s57w_9690ac41-b444-4d39-adfb-de4dd1b4d581/manager/0.log" Feb 17 17:40:50 crc kubenswrapper[4694]: I0217 17:40:50.220889 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57746b5ff9-fk2xd_b43c51f9-625e-4499-bdcc-612213a353df/manager/0.log" Feb 17 17:40:50 crc kubenswrapper[4694]: I0217 17:40:50.294389 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-5zb28_73478819-b2d9-484f-8f31-12636ce0fff1/manager/0.log" Feb 17 17:40:50 crc kubenswrapper[4694]: I0217 17:40:50.527321 4694 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-f8krc_37924017-8b0a-4920-becb-89e528139e25/manager/0.log" Feb 17 17:40:50 crc kubenswrapper[4694]: I0217 17:40:50.587233 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-wj245_42a9eede-5ff2-40da-9491-44a8012320a2/manager/0.log" Feb 17 17:40:50 crc kubenswrapper[4694]: I0217 17:40:50.770735 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-96fff9cb8-4qw8g_df099335-bb95-4e69-9628-9f53a170b043/manager/0.log" Feb 17 17:40:50 crc kubenswrapper[4694]: I0217 17:40:50.952063 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66997756f6-z7ldj_80223b5f-ef20-4fc6-b4bc-b8d63046db41/manager/0.log" Feb 17 17:40:51 crc kubenswrapper[4694]: I0217 17:40:51.121983 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-kqwxc_4ab6ce1a-7a26-4b71-be84-b27da7acf5c4/manager/0.log" Feb 17 17:40:51 crc kubenswrapper[4694]: I0217 17:40:51.349329 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5ddd85db87-8zxvf_c97c1c8a-6504-4db2-ad45-0a0c2f84551f/manager/0.log" Feb 17 17:40:51 crc kubenswrapper[4694]: I0217 17:40:51.529878 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz_c9a788f4-d0b6-4275-9b3c-33f39fe70178/manager/0.log" Feb 17 17:40:52 crc kubenswrapper[4694]: I0217 17:40:52.023258 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5f8bcb546f-d72cj_8096d5be-2884-4a45-839b-1b2b20bc116d/operator/0.log" Feb 17 17:40:52 crc kubenswrapper[4694]: I0217 
17:40:52.257943 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hlsbt_6022b0aa-9d87-47ae-8e99-4f71ef252803/registry-server/0.log" Feb 17 17:40:52 crc kubenswrapper[4694]: I0217 17:40:52.502920 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-pj5f2_db08b317-d51b-471e-a235-2c9b4cd1f6f7/manager/0.log" Feb 17 17:40:52 crc kubenswrapper[4694]: I0217 17:40:52.704145 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-xgzng_3629dece-1e4c-40cf-bb56-d15d6ca8aa44/manager/0.log" Feb 17 17:40:52 crc kubenswrapper[4694]: I0217 17:40:52.793270 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-745bbbd77b-m2ntk_26d16927-5afd-4da9-a66a-c20006f1d9e7/manager/0.log" Feb 17 17:40:52 crc kubenswrapper[4694]: I0217 17:40:52.934010 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ggsvh_a65ffd3d-87f5-4491-b71e-9823c314bc1f/operator/0.log" Feb 17 17:40:53 crc kubenswrapper[4694]: I0217 17:40:53.185943 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-xgh9m_7eee970c-7e73-4420-adc3-331ee21c914c/manager/0.log" Feb 17 17:40:53 crc kubenswrapper[4694]: I0217 17:40:53.355148 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-w98pl_ad0948c6-64ee-47a8-b5da-aaa3b8f051ef/manager/0.log" Feb 17 17:40:53 crc kubenswrapper[4694]: I0217 17:40:53.372407 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-7285z_97487951-8888-4a3b-91fc-76d324fdf255/manager/0.log" Feb 17 17:40:53 crc kubenswrapper[4694]: I0217 
17:40:53.643458 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c469bc6bb-ch6kt_1e4d1433-71e3-4cc3-8873-1fa5cd78e961/manager/0.log" Feb 17 17:40:53 crc kubenswrapper[4694]: I0217 17:40:53.653560 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dbd849dbc-2qx8n_607606da-e993-49b1-98ff-a2e0c2146f8a/manager/0.log" Feb 17 17:40:54 crc kubenswrapper[4694]: I0217 17:40:54.718817 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-xc6sk_77c2abe7-0f53-4b11-932c-1f767a6d21b2/manager/0.log" Feb 17 17:41:11 crc kubenswrapper[4694]: I0217 17:41:11.572134 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zjbh5_cbc2bd3c-fb82-4835-9103-f7bf30e51f17/control-plane-machine-set-operator/0.log" Feb 17 17:41:11 crc kubenswrapper[4694]: I0217 17:41:11.671980 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t44rx_5733b257-6fe2-4df1-aa83-4eaf3a84fdcc/kube-rbac-proxy/0.log" Feb 17 17:41:11 crc kubenswrapper[4694]: I0217 17:41:11.757340 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t44rx_5733b257-6fe2-4df1-aa83-4eaf3a84fdcc/machine-api-operator/0.log" Feb 17 17:41:23 crc kubenswrapper[4694]: I0217 17:41:23.738807 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-k7pjr_a4ed5c6f-777c-4e48-acc5-335e03efbe15/cert-manager-controller/0.log" Feb 17 17:41:23 crc kubenswrapper[4694]: I0217 17:41:23.884326 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-45nhc_56fc387f-0f40-4de1-b4f5-628ecdecc25b/cert-manager-cainjector/0.log" Feb 17 17:41:23 crc 
kubenswrapper[4694]: I0217 17:41:23.963951 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-c29mn_da727533-6298-432b-9048-f78ff8faad8f/cert-manager-webhook/0.log" Feb 17 17:41:35 crc kubenswrapper[4694]: I0217 17:41:35.817303 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-6xx9x_9f3c6b99-3db1-447f-b31c-1692a70ec415/nmstate-console-plugin/0.log" Feb 17 17:41:36 crc kubenswrapper[4694]: I0217 17:41:36.022333 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-t7txz_13812f03-334f-44fa-9c5c-c5d257756b27/nmstate-handler/0.log" Feb 17 17:41:36 crc kubenswrapper[4694]: I0217 17:41:36.045334 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-jsvrg_e83663da-470f-4ebf-ac6b-64612e8724f4/kube-rbac-proxy/0.log" Feb 17 17:41:36 crc kubenswrapper[4694]: I0217 17:41:36.104971 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-jsvrg_e83663da-470f-4ebf-ac6b-64612e8724f4/nmstate-metrics/0.log" Feb 17 17:41:36 crc kubenswrapper[4694]: I0217 17:41:36.201929 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-m65zv_c5fe0950-b548-4cdc-9e9d-c2483a8213d9/nmstate-operator/0.log" Feb 17 17:41:36 crc kubenswrapper[4694]: I0217 17:41:36.267645 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-whp8f_aa8c8617-1bdc-461a-9aea-d534da85b5e4/nmstate-webhook/0.log" Feb 17 17:42:02 crc kubenswrapper[4694]: I0217 17:42:02.977725 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-nhxz7_13962e8e-e444-4010-912e-9c953c8f7b8f/kube-rbac-proxy/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.038247 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-nhxz7_13962e8e-e444-4010-912e-9c953c8f7b8f/controller/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.187121 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.393655 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.405367 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.456740 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.471628 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.712362 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.831466 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.846302 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:42:03 crc kubenswrapper[4694]: I0217 17:42:03.867731 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.010797 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.028540 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.031546 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.066445 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/controller/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.209038 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/kube-rbac-proxy/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.242741 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/frr-metrics/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.286185 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/kube-rbac-proxy-frr/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.415538 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/reloader/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.532945 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-nzwnk_b1d3f052-b387-452c-a154-a4f7cd14a6b7/frr-k8s-webhook-server/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.773390 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d47c4d78b-gffhs_152f177f-c542-4114-9d7b-601185b129b2/manager/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.854067 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7756f55684-8twln_d2d8d1c1-ad28-45c6-8314-935e8c60b976/webhook-server/0.log" Feb 17 17:42:04 crc kubenswrapper[4694]: I0217 17:42:04.994193 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xlw27_e542a8da-05af-4ddd-95dc-cf10576c4658/kube-rbac-proxy/0.log" Feb 17 17:42:05 crc kubenswrapper[4694]: I0217 17:42:05.548491 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/frr/0.log" Feb 17 17:42:05 crc kubenswrapper[4694]: I0217 17:42:05.566437 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xlw27_e542a8da-05af-4ddd-95dc-cf10576c4658/speaker/0.log" Feb 17 17:42:14 crc kubenswrapper[4694]: I0217 17:42:14.618173 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:42:14 crc kubenswrapper[4694]: I0217 17:42:14.618730 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 17 17:42:17 crc kubenswrapper[4694]: I0217 17:42:17.749659 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/util/0.log" Feb 17 17:42:17 crc kubenswrapper[4694]: I0217 17:42:17.958833 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/pull/0.log" Feb 17 17:42:17 crc kubenswrapper[4694]: I0217 17:42:17.966281 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/util/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.035851 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/pull/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.180822 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/util/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.198506 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/extract/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.205780 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/pull/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.374778 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-utilities/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.549578 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-utilities/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.560050 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-content/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.575764 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-content/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.740308 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-utilities/0.log" Feb 17 17:42:18 crc kubenswrapper[4694]: I0217 17:42:18.777807 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-content/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.013993 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-utilities/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.136768 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-utilities/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.165297 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/registry-server/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.216854 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-content/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.232735 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-content/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.437910 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-content/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.442984 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-utilities/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.687354 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/util/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.890362 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/pull/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.905411 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/util/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.954837 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/registry-server/0.log" Feb 17 17:42:19 crc kubenswrapper[4694]: I0217 17:42:19.966815 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/pull/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.126478 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/pull/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.132754 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/util/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.134084 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/extract/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.511003 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w82n9_80752637-17b7-451f-a4f9-c15ff9d5bd47/marketplace-operator/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.543946 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-utilities/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.701014 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-utilities/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.709337 4694 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-content/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.744679 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-content/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.917992 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-content/0.log" Feb 17 17:42:20 crc kubenswrapper[4694]: I0217 17:42:20.928555 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-utilities/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.064913 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/registry-server/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.133857 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-utilities/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.322519 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-content/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.332683 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-content/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.339672 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-utilities/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.527242 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-content/0.log" Feb 17 17:42:21 crc kubenswrapper[4694]: I0217 17:42:21.535220 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-utilities/0.log" Feb 17 17:42:22 crc kubenswrapper[4694]: I0217 17:42:22.169261 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/registry-server/0.log" Feb 17 17:42:40 crc kubenswrapper[4694]: E0217 17:42:40.067820 4694 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:40636->38.102.83.75:46257: write tcp 38.102.83.75:40636->38.102.83.75:46257: write: connection reset by peer Feb 17 17:42:44 crc kubenswrapper[4694]: I0217 17:42:44.617707 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:42:44 crc kubenswrapper[4694]: I0217 17:42:44.618633 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:43:14 crc kubenswrapper[4694]: I0217 17:43:14.618169 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:43:14 crc kubenswrapper[4694]: I0217 17:43:14.618705 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:43:14 crc kubenswrapper[4694]: I0217 17:43:14.618741 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:43:14 crc kubenswrapper[4694]: I0217 17:43:14.619657 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:43:14 crc kubenswrapper[4694]: I0217 17:43:14.619715 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" gracePeriod=600 Feb 17 17:43:14 crc kubenswrapper[4694]: E0217 17:43:14.748596 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:43:15 crc kubenswrapper[4694]: I0217 17:43:15.716955 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" exitCode=0 Feb 17 17:43:15 crc kubenswrapper[4694]: I0217 17:43:15.717118 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e"} Feb 17 17:43:15 crc kubenswrapper[4694]: I0217 17:43:15.717239 4694 scope.go:117] "RemoveContainer" containerID="b5b7617690b041aaf5c7c7db2e7efb468ce9049de1ad2bcc576f9ad578fea777" Feb 17 17:43:15 crc kubenswrapper[4694]: I0217 17:43:15.717974 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:43:15 crc kubenswrapper[4694]: E0217 17:43:15.718446 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:43:27 crc kubenswrapper[4694]: I0217 17:43:27.895784 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:43:27 crc kubenswrapper[4694]: E0217 17:43:27.896434 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:43:42 crc kubenswrapper[4694]: I0217 17:43:42.902460 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:43:42 crc kubenswrapper[4694]: E0217 17:43:42.903366 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:43:57 crc kubenswrapper[4694]: I0217 17:43:57.894875 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:43:57 crc kubenswrapper[4694]: E0217 17:43:57.895498 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:43:57 crc kubenswrapper[4694]: I0217 17:43:57.998075 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-9f54df747-vdnkk" podUID="c0f37d92-d923-43f2-807f-d52cd9003a2c" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 17 
17:44:10 crc kubenswrapper[4694]: I0217 17:44:10.214430 4694 generic.go:334] "Generic (PLEG): container finished" podID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerID="3222e896501f3ea336bc55bf14cc29a3edd518bb0aa840ddc7cc9a2952e000fc" exitCode=0 Feb 17 17:44:10 crc kubenswrapper[4694]: I0217 17:44:10.214564 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" event={"ID":"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8","Type":"ContainerDied","Data":"3222e896501f3ea336bc55bf14cc29a3edd518bb0aa840ddc7cc9a2952e000fc"} Feb 17 17:44:10 crc kubenswrapper[4694]: I0217 17:44:10.215808 4694 scope.go:117] "RemoveContainer" containerID="3222e896501f3ea336bc55bf14cc29a3edd518bb0aa840ddc7cc9a2952e000fc" Feb 17 17:44:10 crc kubenswrapper[4694]: I0217 17:44:10.584977 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p2hq_must-gather-5sqgk_46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8/gather/0.log" Feb 17 17:44:12 crc kubenswrapper[4694]: E0217 17:44:12.772304 4694 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:54980->38.102.83.75:46257: read tcp 38.102.83.75:54980->38.102.83.75:46257: read: connection reset by peer Feb 17 17:44:12 crc kubenswrapper[4694]: I0217 17:44:12.902911 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:44:12 crc kubenswrapper[4694]: E0217 17:44:12.903906 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:44:17 crc kubenswrapper[4694]: I0217 17:44:17.867226 4694 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p2hq/must-gather-5sqgk"] Feb 17 17:44:17 crc kubenswrapper[4694]: I0217 17:44:17.867910 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerName="copy" containerID="cri-o://bfee9f0fdc222eb2749abfe06e68fa8e24c211c2c3f9a3f4b916b58f473c6a73" gracePeriod=2 Feb 17 17:44:17 crc kubenswrapper[4694]: I0217 17:44:17.884260 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p2hq/must-gather-5sqgk"] Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.309556 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p2hq_must-gather-5sqgk_46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8/copy/0.log" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.310257 4694 generic.go:334] "Generic (PLEG): container finished" podID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerID="bfee9f0fdc222eb2749abfe06e68fa8e24c211c2c3f9a3f4b916b58f473c6a73" exitCode=143 Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.429049 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p2hq_must-gather-5sqgk_46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8/copy/0.log" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.429515 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.474837 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-must-gather-output\") pod \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.474952 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkg7m\" (UniqueName: \"kubernetes.io/projected/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-kube-api-access-hkg7m\") pod \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\" (UID: \"46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8\") " Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.480266 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-kube-api-access-hkg7m" (OuterVolumeSpecName: "kube-api-access-hkg7m") pod "46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" (UID: "46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8"). InnerVolumeSpecName "kube-api-access-hkg7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.576925 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkg7m\" (UniqueName: \"kubernetes.io/projected/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-kube-api-access-hkg7m\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.636725 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" (UID: "46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.678227 4694 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 17:44:18 crc kubenswrapper[4694]: I0217 17:44:18.906023 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" path="/var/lib/kubelet/pods/46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8/volumes" Feb 17 17:44:19 crc kubenswrapper[4694]: I0217 17:44:19.320199 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p2hq_must-gather-5sqgk_46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8/copy/0.log" Feb 17 17:44:19 crc kubenswrapper[4694]: I0217 17:44:19.321005 4694 scope.go:117] "RemoveContainer" containerID="bfee9f0fdc222eb2749abfe06e68fa8e24c211c2c3f9a3f4b916b58f473c6a73" Feb 17 17:44:19 crc kubenswrapper[4694]: I0217 17:44:19.321159 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7p2hq/must-gather-5sqgk" Feb 17 17:44:19 crc kubenswrapper[4694]: I0217 17:44:19.343863 4694 scope.go:117] "RemoveContainer" containerID="3222e896501f3ea336bc55bf14cc29a3edd518bb0aa840ddc7cc9a2952e000fc" Feb 17 17:44:23 crc kubenswrapper[4694]: I0217 17:44:23.896399 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:44:23 crc kubenswrapper[4694]: E0217 17:44:23.897123 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:44:37 crc kubenswrapper[4694]: I0217 17:44:37.895412 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:44:37 crc kubenswrapper[4694]: E0217 17:44:37.896310 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:44:49 crc kubenswrapper[4694]: I0217 17:44:49.895784 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:44:49 crc kubenswrapper[4694]: E0217 17:44:49.897215 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.170796 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv"] Feb 17 17:45:00 crc kubenswrapper[4694]: E0217 17:45:00.178744 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerName="copy" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.178785 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerName="copy" Feb 17 17:45:00 crc kubenswrapper[4694]: E0217 17:45:00.178814 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0b367e-8835-4ffb-9349-d5059644e6cb" containerName="container-00" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.178822 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0b367e-8835-4ffb-9349-d5059644e6cb" containerName="container-00" Feb 17 17:45:00 crc kubenswrapper[4694]: E0217 17:45:00.178849 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerName="gather" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.178858 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerName="gather" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.179247 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0b367e-8835-4ffb-9349-d5059644e6cb" containerName="container-00" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.179291 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" 
containerName="copy" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.179300 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="46eb938b-2bfb-4cf0-8dd5-a5b92c186eb8" containerName="gather" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.182123 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.184438 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv"] Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.186081 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.187342 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.289717 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8sn4\" (UniqueName: \"kubernetes.io/projected/8928a6e5-597a-42c5-9f46-8ff5192f5b65-kube-api-access-s8sn4\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.290219 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8928a6e5-597a-42c5-9f46-8ff5192f5b65-config-volume\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.290376 4694 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8928a6e5-597a-42c5-9f46-8ff5192f5b65-secret-volume\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.392306 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8928a6e5-597a-42c5-9f46-8ff5192f5b65-config-volume\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.392807 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8928a6e5-597a-42c5-9f46-8ff5192f5b65-secret-volume\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.393050 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8sn4\" (UniqueName: \"kubernetes.io/projected/8928a6e5-597a-42c5-9f46-8ff5192f5b65-kube-api-access-s8sn4\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.393277 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8928a6e5-597a-42c5-9f46-8ff5192f5b65-config-volume\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.402090 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8928a6e5-597a-42c5-9f46-8ff5192f5b65-secret-volume\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.410732 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8sn4\" (UniqueName: \"kubernetes.io/projected/8928a6e5-597a-42c5-9f46-8ff5192f5b65-kube-api-access-s8sn4\") pod \"collect-profiles-29522505-72vqv\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.505725 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.896669 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:45:00 crc kubenswrapper[4694]: E0217 17:45:00.897175 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:45:00 crc kubenswrapper[4694]: I0217 17:45:00.946922 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv"] Feb 17 17:45:02 crc kubenswrapper[4694]: I0217 17:45:02.248838 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" event={"ID":"8928a6e5-597a-42c5-9f46-8ff5192f5b65","Type":"ContainerStarted","Data":"3cf65599abe4dbc252d89765e391468634c28fe8b2d608407091fdf16f91af81"} Feb 17 17:45:02 crc kubenswrapper[4694]: I0217 17:45:02.249150 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" event={"ID":"8928a6e5-597a-42c5-9f46-8ff5192f5b65","Type":"ContainerStarted","Data":"04d62ef7ca97cb02e34cbacd74c9f8eb44bcd0e757dc45348be7c9d042938d1b"} Feb 17 17:45:02 crc kubenswrapper[4694]: I0217 17:45:02.275949 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" podStartSLOduration=2.275928872 podStartE2EDuration="2.275928872s" podCreationTimestamp="2026-02-17 17:45:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:45:02.27065144 +0000 UTC m=+3770.027726774" watchObservedRunningTime="2026-02-17 17:45:02.275928872 +0000 UTC m=+3770.033004196" Feb 17 17:45:03 crc kubenswrapper[4694]: I0217 17:45:03.260432 4694 generic.go:334] "Generic (PLEG): container finished" podID="8928a6e5-597a-42c5-9f46-8ff5192f5b65" containerID="3cf65599abe4dbc252d89765e391468634c28fe8b2d608407091fdf16f91af81" exitCode=0 Feb 17 17:45:03 crc kubenswrapper[4694]: I0217 17:45:03.260514 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" event={"ID":"8928a6e5-597a-42c5-9f46-8ff5192f5b65","Type":"ContainerDied","Data":"3cf65599abe4dbc252d89765e391468634c28fe8b2d608407091fdf16f91af81"} Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.591804 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.670754 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8928a6e5-597a-42c5-9f46-8ff5192f5b65-secret-volume\") pod \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.670791 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8928a6e5-597a-42c5-9f46-8ff5192f5b65-config-volume\") pod \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.670881 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8sn4\" (UniqueName: 
\"kubernetes.io/projected/8928a6e5-597a-42c5-9f46-8ff5192f5b65-kube-api-access-s8sn4\") pod \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\" (UID: \"8928a6e5-597a-42c5-9f46-8ff5192f5b65\") " Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.671674 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8928a6e5-597a-42c5-9f46-8ff5192f5b65-config-volume" (OuterVolumeSpecName: "config-volume") pod "8928a6e5-597a-42c5-9f46-8ff5192f5b65" (UID: "8928a6e5-597a-42c5-9f46-8ff5192f5b65"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.675784 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8928a6e5-597a-42c5-9f46-8ff5192f5b65-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8928a6e5-597a-42c5-9f46-8ff5192f5b65" (UID: "8928a6e5-597a-42c5-9f46-8ff5192f5b65"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.675833 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8928a6e5-597a-42c5-9f46-8ff5192f5b65-kube-api-access-s8sn4" (OuterVolumeSpecName: "kube-api-access-s8sn4") pod "8928a6e5-597a-42c5-9f46-8ff5192f5b65" (UID: "8928a6e5-597a-42c5-9f46-8ff5192f5b65"). InnerVolumeSpecName "kube-api-access-s8sn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.772586 4694 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8928a6e5-597a-42c5-9f46-8ff5192f5b65-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.772644 4694 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8928a6e5-597a-42c5-9f46-8ff5192f5b65-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 17:45:04 crc kubenswrapper[4694]: I0217 17:45:04.772661 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8sn4\" (UniqueName: \"kubernetes.io/projected/8928a6e5-597a-42c5-9f46-8ff5192f5b65-kube-api-access-s8sn4\") on node \"crc\" DevicePath \"\"" Feb 17 17:45:05 crc kubenswrapper[4694]: I0217 17:45:05.279337 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" event={"ID":"8928a6e5-597a-42c5-9f46-8ff5192f5b65","Type":"ContainerDied","Data":"04d62ef7ca97cb02e34cbacd74c9f8eb44bcd0e757dc45348be7c9d042938d1b"} Feb 17 17:45:05 crc kubenswrapper[4694]: I0217 17:45:05.279376 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522505-72vqv" Feb 17 17:45:05 crc kubenswrapper[4694]: I0217 17:45:05.279399 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d62ef7ca97cb02e34cbacd74c9f8eb44bcd0e757dc45348be7c9d042938d1b" Feb 17 17:45:05 crc kubenswrapper[4694]: I0217 17:45:05.337013 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829"] Feb 17 17:45:05 crc kubenswrapper[4694]: I0217 17:45:05.343925 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522460-qx829"] Feb 17 17:45:06 crc kubenswrapper[4694]: I0217 17:45:06.913895 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf31ae2-9449-4362-8a00-9e1fba466f0b" path="/var/lib/kubelet/pods/1bf31ae2-9449-4362-8a00-9e1fba466f0b/volumes" Feb 17 17:45:15 crc kubenswrapper[4694]: I0217 17:45:15.895387 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:45:15 crc kubenswrapper[4694]: E0217 17:45:15.896306 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:45:30 crc kubenswrapper[4694]: I0217 17:45:30.896007 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:45:30 crc kubenswrapper[4694]: E0217 17:45:30.896827 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:45:42 crc kubenswrapper[4694]: I0217 17:45:42.904492 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:45:42 crc kubenswrapper[4694]: E0217 17:45:42.905892 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:45:43 crc kubenswrapper[4694]: I0217 17:45:43.591781 4694 scope.go:117] "RemoveContainer" containerID="bacc1a5efafed37d82e4fb61794e376467f38113e2905c286f9a455fe615881c" Feb 17 17:45:43 crc kubenswrapper[4694]: I0217 17:45:43.637326 4694 scope.go:117] "RemoveContainer" containerID="2bac79601e7fedd63b3656671c82305768054b61a4128e337dbe3b48ff50900b" Feb 17 17:45:54 crc kubenswrapper[4694]: I0217 17:45:54.896140 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:45:54 crc kubenswrapper[4694]: E0217 17:45:54.897833 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:46:05 crc kubenswrapper[4694]: I0217 17:46:05.895585 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:46:05 crc kubenswrapper[4694]: E0217 17:46:05.896412 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:46:20 crc kubenswrapper[4694]: I0217 17:46:20.895499 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:46:20 crc kubenswrapper[4694]: E0217 17:46:20.896327 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:46:32 crc kubenswrapper[4694]: I0217 17:46:32.903207 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:46:32 crc kubenswrapper[4694]: E0217 17:46:32.904173 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:46:44 crc kubenswrapper[4694]: I0217 17:46:44.895058 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:46:44 crc kubenswrapper[4694]: E0217 17:46:44.895880 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:46:55 crc kubenswrapper[4694]: I0217 17:46:55.895173 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:46:55 crc kubenswrapper[4694]: E0217 17:46:55.895911 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:47:06 crc kubenswrapper[4694]: I0217 17:47:06.896204 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:47:06 crc kubenswrapper[4694]: E0217 17:47:06.896947 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.131876 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tlm5g"] Feb 17 17:47:10 crc kubenswrapper[4694]: E0217 17:47:10.133020 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8928a6e5-597a-42c5-9f46-8ff5192f5b65" containerName="collect-profiles" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.133044 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="8928a6e5-597a-42c5-9f46-8ff5192f5b65" containerName="collect-profiles" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.133648 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="8928a6e5-597a-42c5-9f46-8ff5192f5b65" containerName="collect-profiles" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.139328 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.142549 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlm5g"] Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.210463 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-utilities\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.210566 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkxx\" (UniqueName: \"kubernetes.io/projected/b44e003c-2a34-4a5e-b80d-177d9795428f-kube-api-access-6qkxx\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.210695 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-catalog-content\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.312068 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-catalog-content\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.312179 4694 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-utilities\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.312261 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkxx\" (UniqueName: \"kubernetes.io/projected/b44e003c-2a34-4a5e-b80d-177d9795428f-kube-api-access-6qkxx\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.312669 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-utilities\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.312700 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-catalog-content\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.341488 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkxx\" (UniqueName: \"kubernetes.io/projected/b44e003c-2a34-4a5e-b80d-177d9795428f-kube-api-access-6qkxx\") pod \"redhat-marketplace-tlm5g\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.456678 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:10 crc kubenswrapper[4694]: I0217 17:47:10.956044 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlm5g"] Feb 17 17:47:11 crc kubenswrapper[4694]: I0217 17:47:11.438540 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlm5g" event={"ID":"b44e003c-2a34-4a5e-b80d-177d9795428f","Type":"ContainerDied","Data":"f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c"} Feb 17 17:47:11 crc kubenswrapper[4694]: I0217 17:47:11.438469 4694 generic.go:334] "Generic (PLEG): container finished" podID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerID="f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c" exitCode=0 Feb 17 17:47:11 crc kubenswrapper[4694]: I0217 17:47:11.438994 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlm5g" event={"ID":"b44e003c-2a34-4a5e-b80d-177d9795428f","Type":"ContainerStarted","Data":"0ddec0ed2c9f39162e112f1ce18738cbef8f0124f959eb9496df5a6a139bb19e"} Feb 17 17:47:11 crc kubenswrapper[4694]: I0217 17:47:11.441380 4694 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 17:47:12 crc kubenswrapper[4694]: E0217 17:47:12.763276 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44e003c_2a34_4a5e_b80d_177d9795428f.slice/crio-7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44e003c_2a34_4a5e_b80d_177d9795428f.slice/crio-conmon-7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157.scope\": RecentStats: unable to find data in memory cache]" Feb 17 17:47:13 crc 
kubenswrapper[4694]: I0217 17:47:13.457630 4694 generic.go:334] "Generic (PLEG): container finished" podID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerID="7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157" exitCode=0 Feb 17 17:47:13 crc kubenswrapper[4694]: I0217 17:47:13.457682 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlm5g" event={"ID":"b44e003c-2a34-4a5e-b80d-177d9795428f","Type":"ContainerDied","Data":"7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157"} Feb 17 17:47:14 crc kubenswrapper[4694]: I0217 17:47:14.473442 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlm5g" event={"ID":"b44e003c-2a34-4a5e-b80d-177d9795428f","Type":"ContainerStarted","Data":"781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5"} Feb 17 17:47:14 crc kubenswrapper[4694]: I0217 17:47:14.494487 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tlm5g" podStartSLOduration=2.001696574 podStartE2EDuration="4.494467159s" podCreationTimestamp="2026-02-17 17:47:10 +0000 UTC" firstStartedPulling="2026-02-17 17:47:11.440938088 +0000 UTC m=+3899.198013452" lastFinishedPulling="2026-02-17 17:47:13.933708713 +0000 UTC m=+3901.690784037" observedRunningTime="2026-02-17 17:47:14.493682789 +0000 UTC m=+3902.250758123" watchObservedRunningTime="2026-02-17 17:47:14.494467159 +0000 UTC m=+3902.251542503" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.335865 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qxn6/must-gather-bbwm4"] Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.338043 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.342227 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6qxn6"/"kube-root-ca.crt" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.342757 4694 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6qxn6"/"openshift-service-ca.crt" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.342105 4694 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6qxn6"/"default-dockercfg-cgf9w" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.386491 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6qxn6/must-gather-bbwm4"] Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.464283 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54m9v\" (UniqueName: \"kubernetes.io/projected/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-kube-api-access-54m9v\") pod \"must-gather-bbwm4\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.464414 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-must-gather-output\") pod \"must-gather-bbwm4\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.565999 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54m9v\" (UniqueName: \"kubernetes.io/projected/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-kube-api-access-54m9v\") pod \"must-gather-bbwm4\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " 
pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.566855 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-must-gather-output\") pod \"must-gather-bbwm4\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.567343 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-must-gather-output\") pod \"must-gather-bbwm4\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.584567 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54m9v\" (UniqueName: \"kubernetes.io/projected/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-kube-api-access-54m9v\") pod \"must-gather-bbwm4\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:16 crc kubenswrapper[4694]: I0217 17:47:16.688849 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:47:17 crc kubenswrapper[4694]: I0217 17:47:17.150591 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6qxn6/must-gather-bbwm4"] Feb 17 17:47:17 crc kubenswrapper[4694]: I0217 17:47:17.512879 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" event={"ID":"7a9b22ef-bb0b-4584-96b2-d4c8178c5030","Type":"ContainerStarted","Data":"7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059"} Feb 17 17:47:17 crc kubenswrapper[4694]: I0217 17:47:17.512931 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" event={"ID":"7a9b22ef-bb0b-4584-96b2-d4c8178c5030","Type":"ContainerStarted","Data":"6aab39e1747387a1890be3a69de5a7eb795e76c2c47289bf4e33f73f1e3415b7"} Feb 17 17:47:17 crc kubenswrapper[4694]: I0217 17:47:17.895377 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:47:17 crc kubenswrapper[4694]: E0217 17:47:17.895589 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:47:18 crc kubenswrapper[4694]: I0217 17:47:18.523266 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" event={"ID":"7a9b22ef-bb0b-4584-96b2-d4c8178c5030","Type":"ContainerStarted","Data":"b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7"} Feb 17 17:47:18 crc kubenswrapper[4694]: I0217 17:47:18.542416 4694 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" podStartSLOduration=2.5423915299999997 podStartE2EDuration="2.54239153s" podCreationTimestamp="2026-02-17 17:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:18.537900308 +0000 UTC m=+3906.294975642" watchObservedRunningTime="2026-02-17 17:47:18.54239153 +0000 UTC m=+3906.299466884" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.456860 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.457824 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.505782 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.581178 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.749881 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlm5g"] Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.911689 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-7rqql"] Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.912858 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.963757 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrk72\" (UniqueName: \"kubernetes.io/projected/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-kube-api-access-mrk72\") pod \"crc-debug-7rqql\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:20 crc kubenswrapper[4694]: I0217 17:47:20.963866 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-host\") pod \"crc-debug-7rqql\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.065114 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrk72\" (UniqueName: \"kubernetes.io/projected/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-kube-api-access-mrk72\") pod \"crc-debug-7rqql\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.065236 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-host\") pod \"crc-debug-7rqql\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.065356 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-host\") pod \"crc-debug-7rqql\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:21 crc 
kubenswrapper[4694]: I0217 17:47:21.084397 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrk72\" (UniqueName: \"kubernetes.io/projected/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-kube-api-access-mrk72\") pod \"crc-debug-7rqql\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.235265 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.551501 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" event={"ID":"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb","Type":"ContainerStarted","Data":"9c61d4e84d11b236257e33c3f0d447c4b24496cb630cfb89e4ba9ef5f4be7ce6"} Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.551879 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" event={"ID":"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb","Type":"ContainerStarted","Data":"0574d760a88dc5c0c8b4cb46daa8453a830199e0f155d6189e84d0b6e7d63c31"} Feb 17 17:47:21 crc kubenswrapper[4694]: I0217 17:47:21.575017 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" podStartSLOduration=1.57500143 podStartE2EDuration="1.57500143s" podCreationTimestamp="2026-02-17 17:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 17:47:21.572530908 +0000 UTC m=+3909.329606232" watchObservedRunningTime="2026-02-17 17:47:21.57500143 +0000 UTC m=+3909.332076754" Feb 17 17:47:22 crc kubenswrapper[4694]: I0217 17:47:22.565573 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tlm5g" 
podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="registry-server" containerID="cri-o://781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5" gracePeriod=2 Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.194391 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.209830 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-utilities\") pod \"b44e003c-2a34-4a5e-b80d-177d9795428f\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.210024 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qkxx\" (UniqueName: \"kubernetes.io/projected/b44e003c-2a34-4a5e-b80d-177d9795428f-kube-api-access-6qkxx\") pod \"b44e003c-2a34-4a5e-b80d-177d9795428f\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.210072 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-catalog-content\") pod \"b44e003c-2a34-4a5e-b80d-177d9795428f\" (UID: \"b44e003c-2a34-4a5e-b80d-177d9795428f\") " Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.211127 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-utilities" (OuterVolumeSpecName: "utilities") pod "b44e003c-2a34-4a5e-b80d-177d9795428f" (UID: "b44e003c-2a34-4a5e-b80d-177d9795428f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.216958 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b44e003c-2a34-4a5e-b80d-177d9795428f-kube-api-access-6qkxx" (OuterVolumeSpecName: "kube-api-access-6qkxx") pod "b44e003c-2a34-4a5e-b80d-177d9795428f" (UID: "b44e003c-2a34-4a5e-b80d-177d9795428f"). InnerVolumeSpecName "kube-api-access-6qkxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.234141 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b44e003c-2a34-4a5e-b80d-177d9795428f" (UID: "b44e003c-2a34-4a5e-b80d-177d9795428f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.312456 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.312491 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44e003c-2a34-4a5e-b80d-177d9795428f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.312503 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qkxx\" (UniqueName: \"kubernetes.io/projected/b44e003c-2a34-4a5e-b80d-177d9795428f-kube-api-access-6qkxx\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.576221 4694 generic.go:334] "Generic (PLEG): container finished" podID="b44e003c-2a34-4a5e-b80d-177d9795428f" 
containerID="781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5" exitCode=0 Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.576267 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlm5g" event={"ID":"b44e003c-2a34-4a5e-b80d-177d9795428f","Type":"ContainerDied","Data":"781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5"} Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.576292 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlm5g" event={"ID":"b44e003c-2a34-4a5e-b80d-177d9795428f","Type":"ContainerDied","Data":"0ddec0ed2c9f39162e112f1ce18738cbef8f0124f959eb9496df5a6a139bb19e"} Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.576308 4694 scope.go:117] "RemoveContainer" containerID="781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.576450 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlm5g" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.623310 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlm5g"] Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.626001 4694 scope.go:117] "RemoveContainer" containerID="7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.634021 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlm5g"] Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.649438 4694 scope.go:117] "RemoveContainer" containerID="f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.702124 4694 scope.go:117] "RemoveContainer" containerID="781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5" Feb 17 17:47:23 crc kubenswrapper[4694]: E0217 17:47:23.702649 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5\": container with ID starting with 781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5 not found: ID does not exist" containerID="781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.702688 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5"} err="failed to get container status \"781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5\": rpc error: code = NotFound desc = could not find container \"781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5\": container with ID starting with 781d1763805ac0dfce622c184fb56b7777293f0cec3884619f53cb2aabe388a5 not found: 
ID does not exist" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.702717 4694 scope.go:117] "RemoveContainer" containerID="7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157" Feb 17 17:47:23 crc kubenswrapper[4694]: E0217 17:47:23.703109 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157\": container with ID starting with 7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157 not found: ID does not exist" containerID="7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.703223 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157"} err="failed to get container status \"7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157\": rpc error: code = NotFound desc = could not find container \"7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157\": container with ID starting with 7c944dbe955a53a309bc936332269da6ef6a6afc8be3e2390346f046ce25d157 not found: ID does not exist" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.703318 4694 scope.go:117] "RemoveContainer" containerID="f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c" Feb 17 17:47:23 crc kubenswrapper[4694]: E0217 17:47:23.703746 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c\": container with ID starting with f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c not found: ID does not exist" containerID="f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c" Feb 17 17:47:23 crc kubenswrapper[4694]: I0217 17:47:23.703777 4694 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c"} err="failed to get container status \"f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c\": rpc error: code = NotFound desc = could not find container \"f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c\": container with ID starting with f7ca00af335ecaf728636d351cb59ed4c3bc56599ab85ba7a76afb8d4a073c8c not found: ID does not exist" Feb 17 17:47:24 crc kubenswrapper[4694]: I0217 17:47:24.907094 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" path="/var/lib/kubelet/pods/b44e003c-2a34-4a5e-b80d-177d9795428f/volumes" Feb 17 17:47:28 crc kubenswrapper[4694]: I0217 17:47:28.896529 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:47:28 crc kubenswrapper[4694]: E0217 17:47:28.897137 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:47:43 crc kubenswrapper[4694]: I0217 17:47:43.895354 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:47:43 crc kubenswrapper[4694]: E0217 17:47:43.896101 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:47:54 crc kubenswrapper[4694]: I0217 17:47:54.855919 4694 generic.go:334] "Generic (PLEG): container finished" podID="ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" containerID="9c61d4e84d11b236257e33c3f0d447c4b24496cb630cfb89e4ba9ef5f4be7ce6" exitCode=0 Feb 17 17:47:54 crc kubenswrapper[4694]: I0217 17:47:54.856021 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" event={"ID":"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb","Type":"ContainerDied","Data":"9c61d4e84d11b236257e33c3f0d447c4b24496cb630cfb89e4ba9ef5f4be7ce6"} Feb 17 17:47:54 crc kubenswrapper[4694]: I0217 17:47:54.895523 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:47:54 crc kubenswrapper[4694]: E0217 17:47:54.896110 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:47:55 crc kubenswrapper[4694]: I0217 17:47:55.988153 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.022022 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrk72\" (UniqueName: \"kubernetes.io/projected/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-kube-api-access-mrk72\") pod \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.022135 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-host\") pod \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\" (UID: \"ab2c1520-98f1-4379-aeca-c0aa4d63a5fb\") " Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.022451 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-host" (OuterVolumeSpecName: "host") pod "ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" (UID: "ab2c1520-98f1-4379-aeca-c0aa4d63a5fb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.033030 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-7rqql"] Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.036974 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-kube-api-access-mrk72" (OuterVolumeSpecName: "kube-api-access-mrk72") pod "ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" (UID: "ab2c1520-98f1-4379-aeca-c0aa4d63a5fb"). InnerVolumeSpecName "kube-api-access-mrk72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.041469 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-7rqql"] Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.124141 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrk72\" (UniqueName: \"kubernetes.io/projected/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-kube-api-access-mrk72\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.124178 4694 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb-host\") on node \"crc\" DevicePath \"\"" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.788702 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ccs88"] Feb 17 17:47:56 crc kubenswrapper[4694]: E0217 17:47:56.789160 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="registry-server" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.789187 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="registry-server" Feb 17 17:47:56 crc kubenswrapper[4694]: E0217 17:47:56.789216 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" containerName="container-00" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.789223 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" containerName="container-00" Feb 17 17:47:56 crc kubenswrapper[4694]: E0217 17:47:56.789246 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="extract-content" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.789253 4694 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="extract-content" Feb 17 17:47:56 crc kubenswrapper[4694]: E0217 17:47:56.789264 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="extract-utilities" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.789270 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="extract-utilities" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.789426 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" containerName="container-00" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.789452 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="b44e003c-2a34-4a5e-b80d-177d9795428f" containerName="registry-server" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.790729 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.798980 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccs88"] Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.837881 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-utilities\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.838231 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtvm\" (UniqueName: \"kubernetes.io/projected/db20102e-2ea3-4877-baeb-c8aa97d86a19-kube-api-access-cdtvm\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.838309 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-catalog-content\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.874921 4694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0574d760a88dc5c0c8b4cb46daa8453a830199e0f155d6189e84d0b6e7d63c31" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.875048 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-7rqql" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.905898 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab2c1520-98f1-4379-aeca-c0aa4d63a5fb" path="/var/lib/kubelet/pods/ab2c1520-98f1-4379-aeca-c0aa4d63a5fb/volumes" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.940170 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-utilities\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.940227 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtvm\" (UniqueName: \"kubernetes.io/projected/db20102e-2ea3-4877-baeb-c8aa97d86a19-kube-api-access-cdtvm\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.940353 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-catalog-content\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.941547 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-utilities\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.941681 4694 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-catalog-content\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:56 crc kubenswrapper[4694]: I0217 17:47:56.959661 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdtvm\" (UniqueName: \"kubernetes.io/projected/db20102e-2ea3-4877-baeb-c8aa97d86a19-kube-api-access-cdtvm\") pod \"community-operators-ccs88\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.114103 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.431723 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-z8vl6"] Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.434995 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.474261 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccs88"] Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.551359 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c83d40b-3018-492a-a5d4-b546fb465117-host\") pod \"crc-debug-z8vl6\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.551523 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdrk\" (UniqueName: \"kubernetes.io/projected/9c83d40b-3018-492a-a5d4-b546fb465117-kube-api-access-qgdrk\") pod \"crc-debug-z8vl6\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.653558 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdrk\" (UniqueName: \"kubernetes.io/projected/9c83d40b-3018-492a-a5d4-b546fb465117-kube-api-access-qgdrk\") pod \"crc-debug-z8vl6\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.653772 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c83d40b-3018-492a-a5d4-b546fb465117-host\") pod \"crc-debug-z8vl6\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.653959 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/9c83d40b-3018-492a-a5d4-b546fb465117-host\") pod \"crc-debug-z8vl6\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.677264 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdrk\" (UniqueName: \"kubernetes.io/projected/9c83d40b-3018-492a-a5d4-b546fb465117-kube-api-access-qgdrk\") pod \"crc-debug-z8vl6\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.858566 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:47:57 crc kubenswrapper[4694]: I0217 17:47:57.886974 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerStarted","Data":"3aa62fb3deed0f0a7c49d9384c0fa46af230554645ba9cabb3f0fda72e16dabd"} Feb 17 17:47:57 crc kubenswrapper[4694]: W0217 17:47:57.888586 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c83d40b_3018_492a_a5d4_b546fb465117.slice/crio-c8c97e010d593cfea2530dcb5695fc3881447b20ca1c1dcee3d9b0b0c48a1200 WatchSource:0}: Error finding container c8c97e010d593cfea2530dcb5695fc3881447b20ca1c1dcee3d9b0b0c48a1200: Status 404 returned error can't find the container with id c8c97e010d593cfea2530dcb5695fc3881447b20ca1c1dcee3d9b0b0c48a1200 Feb 17 17:47:58 crc kubenswrapper[4694]: I0217 17:47:58.898790 4694 generic.go:334] "Generic (PLEG): container finished" podID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerID="ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068" exitCode=0 Feb 17 17:47:58 crc kubenswrapper[4694]: I0217 17:47:58.901199 4694 generic.go:334] "Generic (PLEG): 
container finished" podID="9c83d40b-3018-492a-a5d4-b546fb465117" containerID="eedd31dfcd6d09bdec7cf17f18dfc6c9eeb86ca8a2c1728e181e4e41b4567d48" exitCode=0 Feb 17 17:47:58 crc kubenswrapper[4694]: I0217 17:47:58.905265 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerDied","Data":"ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068"} Feb 17 17:47:58 crc kubenswrapper[4694]: I0217 17:47:58.905302 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" event={"ID":"9c83d40b-3018-492a-a5d4-b546fb465117","Type":"ContainerDied","Data":"eedd31dfcd6d09bdec7cf17f18dfc6c9eeb86ca8a2c1728e181e4e41b4567d48"} Feb 17 17:47:58 crc kubenswrapper[4694]: I0217 17:47:58.905322 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" event={"ID":"9c83d40b-3018-492a-a5d4-b546fb465117","Type":"ContainerStarted","Data":"c8c97e010d593cfea2530dcb5695fc3881447b20ca1c1dcee3d9b0b0c48a1200"} Feb 17 17:47:59 crc kubenswrapper[4694]: I0217 17:47:59.322104 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-z8vl6"] Feb 17 17:47:59 crc kubenswrapper[4694]: I0217 17:47:59.329620 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-z8vl6"] Feb 17 17:47:59 crc kubenswrapper[4694]: I0217 17:47:59.919824 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerStarted","Data":"695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb"} Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.097377 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.199421 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c83d40b-3018-492a-a5d4-b546fb465117-host\") pod \"9c83d40b-3018-492a-a5d4-b546fb465117\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.199565 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgdrk\" (UniqueName: \"kubernetes.io/projected/9c83d40b-3018-492a-a5d4-b546fb465117-kube-api-access-qgdrk\") pod \"9c83d40b-3018-492a-a5d4-b546fb465117\" (UID: \"9c83d40b-3018-492a-a5d4-b546fb465117\") " Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.200629 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c83d40b-3018-492a-a5d4-b546fb465117-host" (OuterVolumeSpecName: "host") pod "9c83d40b-3018-492a-a5d4-b546fb465117" (UID: "9c83d40b-3018-492a-a5d4-b546fb465117"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.201103 4694 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c83d40b-3018-492a-a5d4-b546fb465117-host\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.217216 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c83d40b-3018-492a-a5d4-b546fb465117-kube-api-access-qgdrk" (OuterVolumeSpecName: "kube-api-access-qgdrk") pod "9c83d40b-3018-492a-a5d4-b546fb465117" (UID: "9c83d40b-3018-492a-a5d4-b546fb465117"). InnerVolumeSpecName "kube-api-access-qgdrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.303531 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdrk\" (UniqueName: \"kubernetes.io/projected/9c83d40b-3018-492a-a5d4-b546fb465117-kube-api-access-qgdrk\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.534971 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-j4fst"] Feb 17 17:48:00 crc kubenswrapper[4694]: E0217 17:48:00.535400 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c83d40b-3018-492a-a5d4-b546fb465117" containerName="container-00" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.535424 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c83d40b-3018-492a-a5d4-b546fb465117" containerName="container-00" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.535860 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c83d40b-3018-492a-a5d4-b546fb465117" containerName="container-00" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.536439 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.710744 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6tjc\" (UniqueName: \"kubernetes.io/projected/fc41bc90-6a5f-4f89-a05e-1952af939f40-kube-api-access-j6tjc\") pod \"crc-debug-j4fst\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.711084 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc41bc90-6a5f-4f89-a05e-1952af939f40-host\") pod \"crc-debug-j4fst\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.812838 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6tjc\" (UniqueName: \"kubernetes.io/projected/fc41bc90-6a5f-4f89-a05e-1952af939f40-kube-api-access-j6tjc\") pod \"crc-debug-j4fst\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.812922 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc41bc90-6a5f-4f89-a05e-1952af939f40-host\") pod \"crc-debug-j4fst\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.813124 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc41bc90-6a5f-4f89-a05e-1952af939f40-host\") pod \"crc-debug-j4fst\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc 
kubenswrapper[4694]: I0217 17:48:00.827970 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6tjc\" (UniqueName: \"kubernetes.io/projected/fc41bc90-6a5f-4f89-a05e-1952af939f40-kube-api-access-j6tjc\") pod \"crc-debug-j4fst\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.856142 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.927305 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c83d40b-3018-492a-a5d4-b546fb465117" path="/var/lib/kubelet/pods/9c83d40b-3018-492a-a5d4-b546fb465117/volumes" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.942809 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-j4fst" event={"ID":"fc41bc90-6a5f-4f89-a05e-1952af939f40","Type":"ContainerStarted","Data":"1f53b2583b25320fea3f6e00df3b266c9f146c155c77d8d577716034df8c536e"} Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.946554 4694 scope.go:117] "RemoveContainer" containerID="eedd31dfcd6d09bdec7cf17f18dfc6c9eeb86ca8a2c1728e181e4e41b4567d48" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.946715 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-z8vl6" Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.951776 4694 generic.go:334] "Generic (PLEG): container finished" podID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerID="695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb" exitCode=0 Feb 17 17:48:00 crc kubenswrapper[4694]: I0217 17:48:00.951815 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerDied","Data":"695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb"} Feb 17 17:48:01 crc kubenswrapper[4694]: I0217 17:48:01.961301 4694 generic.go:334] "Generic (PLEG): container finished" podID="fc41bc90-6a5f-4f89-a05e-1952af939f40" containerID="0bf91ef78f36741efb90e2de04a8ebe008cc4b3953ff95b3cf675435e78460e6" exitCode=0 Feb 17 17:48:01 crc kubenswrapper[4694]: I0217 17:48:01.961514 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/crc-debug-j4fst" event={"ID":"fc41bc90-6a5f-4f89-a05e-1952af939f40","Type":"ContainerDied","Data":"0bf91ef78f36741efb90e2de04a8ebe008cc4b3953ff95b3cf675435e78460e6"} Feb 17 17:48:02 crc kubenswrapper[4694]: I0217 17:48:02.004905 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-j4fst"] Feb 17 17:48:02 crc kubenswrapper[4694]: I0217 17:48:02.013345 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qxn6/crc-debug-j4fst"] Feb 17 17:48:02 crc kubenswrapper[4694]: I0217 17:48:02.974550 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerStarted","Data":"42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6"} Feb 17 17:48:02 crc kubenswrapper[4694]: I0217 17:48:02.998171 4694 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-ccs88" podStartSLOduration=4.077765055 podStartE2EDuration="6.998151554s" podCreationTimestamp="2026-02-17 17:47:56 +0000 UTC" firstStartedPulling="2026-02-17 17:47:58.900280206 +0000 UTC m=+3946.657355530" lastFinishedPulling="2026-02-17 17:48:01.820666695 +0000 UTC m=+3949.577742029" observedRunningTime="2026-02-17 17:48:02.992544384 +0000 UTC m=+3950.749619698" watchObservedRunningTime="2026-02-17 17:48:02.998151554 +0000 UTC m=+3950.755226878" Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.089525 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.157689 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6tjc\" (UniqueName: \"kubernetes.io/projected/fc41bc90-6a5f-4f89-a05e-1952af939f40-kube-api-access-j6tjc\") pod \"fc41bc90-6a5f-4f89-a05e-1952af939f40\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.157783 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc41bc90-6a5f-4f89-a05e-1952af939f40-host\") pod \"fc41bc90-6a5f-4f89-a05e-1952af939f40\" (UID: \"fc41bc90-6a5f-4f89-a05e-1952af939f40\") " Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.157945 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc41bc90-6a5f-4f89-a05e-1952af939f40-host" (OuterVolumeSpecName: "host") pod "fc41bc90-6a5f-4f89-a05e-1952af939f40" (UID: "fc41bc90-6a5f-4f89-a05e-1952af939f40"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.158384 4694 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc41bc90-6a5f-4f89-a05e-1952af939f40-host\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.162693 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc41bc90-6a5f-4f89-a05e-1952af939f40-kube-api-access-j6tjc" (OuterVolumeSpecName: "kube-api-access-j6tjc") pod "fc41bc90-6a5f-4f89-a05e-1952af939f40" (UID: "fc41bc90-6a5f-4f89-a05e-1952af939f40"). InnerVolumeSpecName "kube-api-access-j6tjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:48:03 crc kubenswrapper[4694]: I0217 17:48:03.259988 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6tjc\" (UniqueName: \"kubernetes.io/projected/fc41bc90-6a5f-4f89-a05e-1952af939f40-kube-api-access-j6tjc\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:04 crc kubenswrapper[4694]: I0217 17:48:04.002672 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6qxn6/crc-debug-j4fst" Feb 17 17:48:04 crc kubenswrapper[4694]: I0217 17:48:04.002913 4694 scope.go:117] "RemoveContainer" containerID="0bf91ef78f36741efb90e2de04a8ebe008cc4b3953ff95b3cf675435e78460e6" Feb 17 17:48:04 crc kubenswrapper[4694]: E0217 17:48:04.050074 4694 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc41bc90_6a5f_4f89_a05e_1952af939f40.slice/crio-1f53b2583b25320fea3f6e00df3b266c9f146c155c77d8d577716034df8c536e\": RecentStats: unable to find data in memory cache]" Feb 17 17:48:04 crc kubenswrapper[4694]: I0217 17:48:04.906516 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc41bc90-6a5f-4f89-a05e-1952af939f40" path="/var/lib/kubelet/pods/fc41bc90-6a5f-4f89-a05e-1952af939f40/volumes" Feb 17 17:48:05 crc kubenswrapper[4694]: I0217 17:48:05.895940 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:48:05 crc kubenswrapper[4694]: E0217 17:48:05.896223 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:48:07 crc kubenswrapper[4694]: I0217 17:48:07.114508 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:48:07 crc kubenswrapper[4694]: I0217 17:48:07.114831 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:48:07 crc 
kubenswrapper[4694]: I0217 17:48:07.166861 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:48:08 crc kubenswrapper[4694]: I0217 17:48:08.504302 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:48:08 crc kubenswrapper[4694]: I0217 17:48:08.573803 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ccs88"] Feb 17 17:48:10 crc kubenswrapper[4694]: I0217 17:48:10.061492 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ccs88" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="registry-server" containerID="cri-o://42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6" gracePeriod=2 Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.000484 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.072177 4694 generic.go:334] "Generic (PLEG): container finished" podID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerID="42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6" exitCode=0 Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.072218 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerDied","Data":"42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6"} Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.072242 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccs88" event={"ID":"db20102e-2ea3-4877-baeb-c8aa97d86a19","Type":"ContainerDied","Data":"3aa62fb3deed0f0a7c49d9384c0fa46af230554645ba9cabb3f0fda72e16dabd"} Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.072276 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ccs88" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.072289 4694 scope.go:117] "RemoveContainer" containerID="42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.097716 4694 scope.go:117] "RemoveContainer" containerID="695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.119643 4694 scope.go:117] "RemoveContainer" containerID="ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.121556 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdtvm\" (UniqueName: \"kubernetes.io/projected/db20102e-2ea3-4877-baeb-c8aa97d86a19-kube-api-access-cdtvm\") pod \"db20102e-2ea3-4877-baeb-c8aa97d86a19\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.121755 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-catalog-content\") pod \"db20102e-2ea3-4877-baeb-c8aa97d86a19\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.121940 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-utilities\") pod \"db20102e-2ea3-4877-baeb-c8aa97d86a19\" (UID: \"db20102e-2ea3-4877-baeb-c8aa97d86a19\") " Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.124043 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-utilities" (OuterVolumeSpecName: "utilities") pod "db20102e-2ea3-4877-baeb-c8aa97d86a19" (UID: 
"db20102e-2ea3-4877-baeb-c8aa97d86a19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.134013 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db20102e-2ea3-4877-baeb-c8aa97d86a19-kube-api-access-cdtvm" (OuterVolumeSpecName: "kube-api-access-cdtvm") pod "db20102e-2ea3-4877-baeb-c8aa97d86a19" (UID: "db20102e-2ea3-4877-baeb-c8aa97d86a19"). InnerVolumeSpecName "kube-api-access-cdtvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.202375 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db20102e-2ea3-4877-baeb-c8aa97d86a19" (UID: "db20102e-2ea3-4877-baeb-c8aa97d86a19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.211731 4694 scope.go:117] "RemoveContainer" containerID="42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6" Feb 17 17:48:11 crc kubenswrapper[4694]: E0217 17:48:11.212115 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6\": container with ID starting with 42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6 not found: ID does not exist" containerID="42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.212154 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6"} err="failed to get container status 
\"42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6\": rpc error: code = NotFound desc = could not find container \"42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6\": container with ID starting with 42e162e179b891272a0ee24bf4c273ea035bd8ad938c1fb1b0b084267a9b71f6 not found: ID does not exist" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.212181 4694 scope.go:117] "RemoveContainer" containerID="695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb" Feb 17 17:48:11 crc kubenswrapper[4694]: E0217 17:48:11.212659 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb\": container with ID starting with 695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb not found: ID does not exist" containerID="695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.212690 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb"} err="failed to get container status \"695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb\": rpc error: code = NotFound desc = could not find container \"695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb\": container with ID starting with 695bb8a8e964ec098c384b138b8470555f56ee289c50905aa73f1451045b3cbb not found: ID does not exist" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.212732 4694 scope.go:117] "RemoveContainer" containerID="ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068" Feb 17 17:48:11 crc kubenswrapper[4694]: E0217 17:48:11.212985 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068\": container with ID starting with ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068 not found: ID does not exist" containerID="ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.213011 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068"} err="failed to get container status \"ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068\": rpc error: code = NotFound desc = could not find container \"ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068\": container with ID starting with ff339c6fb7c6006d3db1177de2c82ef2c80b9976eae0d383dfd2742f21a20068 not found: ID does not exist" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.224431 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.224457 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db20102e-2ea3-4877-baeb-c8aa97d86a19-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.224472 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdtvm\" (UniqueName: \"kubernetes.io/projected/db20102e-2ea3-4877-baeb-c8aa97d86a19-kube-api-access-cdtvm\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.449944 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ccs88"] Feb 17 17:48:11 crc kubenswrapper[4694]: I0217 17:48:11.468594 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-ccs88"] Feb 17 17:48:12 crc kubenswrapper[4694]: I0217 17:48:12.906081 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" path="/var/lib/kubelet/pods/db20102e-2ea3-4877-baeb-c8aa97d86a19/volumes" Feb 17 17:48:20 crc kubenswrapper[4694]: I0217 17:48:20.895755 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:48:21 crc kubenswrapper[4694]: I0217 17:48:21.157057 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"5b5174c4ff864458d1d7894579d126fb8403a1564a80db04be68f3c393a19a1b"} Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.904736 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r66bf"] Feb 17 17:48:33 crc kubenswrapper[4694]: E0217 17:48:33.905604 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="extract-content" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.905666 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="extract-content" Feb 17 17:48:33 crc kubenswrapper[4694]: E0217 17:48:33.905689 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="registry-server" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.905695 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="registry-server" Feb 17 17:48:33 crc kubenswrapper[4694]: E0217 17:48:33.905719 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc41bc90-6a5f-4f89-a05e-1952af939f40" containerName="container-00" Feb 17 
17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.905726 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc41bc90-6a5f-4f89-a05e-1952af939f40" containerName="container-00" Feb 17 17:48:33 crc kubenswrapper[4694]: E0217 17:48:33.905741 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="extract-utilities" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.905746 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="extract-utilities" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.905912 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="db20102e-2ea3-4877-baeb-c8aa97d86a19" containerName="registry-server" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.905927 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc41bc90-6a5f-4f89-a05e-1952af939f40" containerName="container-00" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.907276 4694 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.920110 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r66bf"] Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.967278 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-catalog-content\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.967393 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztd4k\" (UniqueName: \"kubernetes.io/projected/e36f83bb-fee1-4584-ae45-a4b6345b7c41-kube-api-access-ztd4k\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:33 crc kubenswrapper[4694]: I0217 17:48:33.967493 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-utilities\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.069570 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-catalog-content\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.069948 4694 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ztd4k\" (UniqueName: \"kubernetes.io/projected/e36f83bb-fee1-4584-ae45-a4b6345b7c41-kube-api-access-ztd4k\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.070011 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-catalog-content\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.070014 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-utilities\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.070235 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-utilities\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.091472 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cff554946-ddg9w_1843aa1a-460a-42ec-adb2-b20b48c71a90/barbican-api/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.100336 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztd4k\" (UniqueName: \"kubernetes.io/projected/e36f83bb-fee1-4584-ae45-a4b6345b7c41-kube-api-access-ztd4k\") pod \"redhat-operators-r66bf\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " 
pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.227109 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.228588 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cff554946-ddg9w_1843aa1a-460a-42ec-adb2-b20b48c71a90/barbican-api-log/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.298171 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-694d8d5c8-nq2bp_d79bfe1a-e161-41b0-8eed-0f1879b1f990/barbican-keystone-listener/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.445908 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-694d8d5c8-nq2bp_d79bfe1a-e161-41b0-8eed-0f1879b1f990/barbican-keystone-listener-log/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.477666 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b65b64c9-dlmdm_0470d53c-a76c-4cf3-8f95-1ae293182645/barbican-worker/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.532387 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6b65b64c9-dlmdm_0470d53c-a76c-4cf3-8f95-1ae293182645/barbican-worker-log/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.747876 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s5fp5_c7b0dfe8-6f12-4b98-a4a2-ee28ce89971d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:34 crc kubenswrapper[4694]: I0217 17:48:34.792673 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/ceilometer-central-agent/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.029425 
4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/ceilometer-notification-agent/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.053566 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/sg-core/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.056765 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_20e5b774-d9a5-4a32-8e29-63543214e090/proxy-httpd/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.267744 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r66bf"] Feb 17 17:48:35 crc kubenswrapper[4694]: W0217 17:48:35.274777 4694 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode36f83bb_fee1_4584_ae45_a4b6345b7c41.slice/crio-10877a161dcc05b420d7ad801d56570f22cb4fed01aec4823c1165518e26d42e WatchSource:0}: Error finding container 10877a161dcc05b420d7ad801d56570f22cb4fed01aec4823c1165518e26d42e: Status 404 returned error can't find the container with id 10877a161dcc05b420d7ad801d56570f22cb4fed01aec4823c1165518e26d42e Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.304664 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e5767e0e-627e-4e5a-9ee9-c150b7bc2d72/cinder-api/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.307680 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerStarted","Data":"10877a161dcc05b420d7ad801d56570f22cb4fed01aec4823c1165518e26d42e"} Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.366172 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_e5767e0e-627e-4e5a-9ee9-c150b7bc2d72/cinder-api-log/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.417241 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_83a8e274-5312-4be8-9f81-c7b13a2effe1/cinder-scheduler/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.658071 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_83a8e274-5312-4be8-9f81-c7b13a2effe1/probe/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.709207 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2rtr4_af1e17e7-cd69-4f0f-8e3f-e36399e001a8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.860446 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zqw5c_3b2a9feb-de71-42ff-b0ae-f4697f525469/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:35 crc kubenswrapper[4694]: I0217 17:48:35.971598 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-l2p9l_c65ebfa5-bbb3-4011-8593-8cfbd2765254/init/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.134418 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-l2p9l_c65ebfa5-bbb3-4011-8593-8cfbd2765254/init/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.232407 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lbchr_ef5a14b7-490f-48b6-a150-6437a2a18fda/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.244991 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-l2p9l_c65ebfa5-bbb3-4011-8593-8cfbd2765254/dnsmasq-dns/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.321921 4694 generic.go:334] "Generic (PLEG): container finished" podID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerID="b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c" exitCode=0 Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.322076 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerDied","Data":"b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c"} Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.425488 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b48a41b5-4a74-4883-a067-660e674ceecb/glance-httpd/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.425587 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b48a41b5-4a74-4883-a067-660e674ceecb/glance-log/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.584054 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e061fc0e-dd5f-429f-8275-0a744dfc846d/glance-httpd/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.606878 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e061fc0e-dd5f-429f-8275-0a744dfc846d/glance-log/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.791707 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b8f4f9856-rcwl9_17711b82-3f49-41da-b17d-785c70869492/horizon/0.log" Feb 17 17:48:36 crc kubenswrapper[4694]: I0217 17:48:36.915817 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tf7q5_fc2a86d9-61f8-4af3-9835-2aeea9736b84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:37 crc kubenswrapper[4694]: I0217 17:48:37.163238 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rwrb8_a74ac9aa-ac88-4b13-b10b-9fe0f9195f35/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:37 crc kubenswrapper[4694]: I0217 17:48:37.194592 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7b8f4f9856-rcwl9_17711b82-3f49-41da-b17d-785c70869492/horizon-log/0.log" Feb 17 17:48:37 crc kubenswrapper[4694]: I0217 17:48:37.332309 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerStarted","Data":"5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923"} Feb 17 17:48:37 crc kubenswrapper[4694]: I0217 17:48:37.405286 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8667649c99-28rzh_b0aae110-6e5c-4f32-95d9-b4b3429ca622/keystone-api/0.log" Feb 17 17:48:37 crc kubenswrapper[4694]: I0217 17:48:37.421868 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_eeee4610-5faa-46a3-815b-2b04150c9abf/kube-state-metrics/0.log" Feb 17 17:48:37 crc kubenswrapper[4694]: I0217 17:48:37.663063 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bjjdh_032fed18-d394-4743-ac9d-efa8d472bbc2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:38 crc kubenswrapper[4694]: I0217 17:48:38.111121 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-699759854f-bj949_c626c95e-85d3-4ba2-8453-060b57d2ca05/neutron-httpd/0.log" Feb 17 17:48:38 crc kubenswrapper[4694]: I0217 17:48:38.132787 
4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-699759854f-bj949_c626c95e-85d3-4ba2-8453-060b57d2ca05/neutron-api/0.log" Feb 17 17:48:38 crc kubenswrapper[4694]: I0217 17:48:38.256093 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8pvk4_55da2ff6-efab-4cee-acb9-b0a04edc8980/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:38 crc kubenswrapper[4694]: I0217 17:48:38.782495 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8083aa6-ef30-42ca-b979-e21a0697ce79/nova-api-log/0.log" Feb 17 17:48:39 crc kubenswrapper[4694]: I0217 17:48:39.155514 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8083aa6-ef30-42ca-b979-e21a0697ce79/nova-api-api/0.log" Feb 17 17:48:39 crc kubenswrapper[4694]: I0217 17:48:39.330004 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6d7b20d4-ec67-4732-bb23-97f5dacf1af1/nova-cell0-conductor-conductor/0.log" Feb 17 17:48:39 crc kubenswrapper[4694]: I0217 17:48:39.484625 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_91495551-d244-497c-b8f1-376b3206a3aa/nova-cell1-conductor-conductor/0.log" Feb 17 17:48:39 crc kubenswrapper[4694]: I0217 17:48:39.568468 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9894d581-eaea-45f8-a4ca-1a73c9fc778b/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 17:48:39 crc kubenswrapper[4694]: I0217 17:48:39.648211 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-ddccg_f97bc145-9375-4a33-8b64-699355feb0fd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:39 crc kubenswrapper[4694]: I0217 17:48:39.923412 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_f396879b-c24b-478f-b98f-24347a13a36d/nova-metadata-log/0.log" Feb 17 17:48:40 crc kubenswrapper[4694]: I0217 17:48:40.276168 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a7cab10b-d837-44e6-81c7-8bfdb36a4d3c/mysql-bootstrap/0.log" Feb 17 17:48:40 crc kubenswrapper[4694]: I0217 17:48:40.343072 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_204bee86-a2fa-4fb3-bb90-60f67cb66bc7/nova-scheduler-scheduler/0.log" Feb 17 17:48:40 crc kubenswrapper[4694]: I0217 17:48:40.355784 4694 generic.go:334] "Generic (PLEG): container finished" podID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerID="5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923" exitCode=0 Feb 17 17:48:40 crc kubenswrapper[4694]: I0217 17:48:40.355828 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerDied","Data":"5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923"} Feb 17 17:48:40 crc kubenswrapper[4694]: I0217 17:48:40.443242 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a7cab10b-d837-44e6-81c7-8bfdb36a4d3c/mysql-bootstrap/0.log" Feb 17 17:48:40 crc kubenswrapper[4694]: I0217 17:48:40.453911 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a7cab10b-d837-44e6-81c7-8bfdb36a4d3c/galera/0.log" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.276778 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f396879b-c24b-478f-b98f-24347a13a36d/nova-metadata-metadata/0.log" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.336655 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a0c805cf-d310-4594-8584-1061330e4c94/mysql-bootstrap/0.log" 
Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.371457 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerStarted","Data":"7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b"} Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.400082 4694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r66bf" podStartSLOduration=3.983650853 podStartE2EDuration="8.400064405s" podCreationTimestamp="2026-02-17 17:48:33 +0000 UTC" firstStartedPulling="2026-02-17 17:48:36.324151424 +0000 UTC m=+3984.081226748" lastFinishedPulling="2026-02-17 17:48:40.740564976 +0000 UTC m=+3988.497640300" observedRunningTime="2026-02-17 17:48:41.392703832 +0000 UTC m=+3989.149779156" watchObservedRunningTime="2026-02-17 17:48:41.400064405 +0000 UTC m=+3989.157139729" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.594254 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a0c805cf-d310-4594-8584-1061330e4c94/mysql-bootstrap/0.log" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.600743 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a0c805cf-d310-4594-8584-1061330e4c94/galera/0.log" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.611381 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_30281bdf-b35e-4ac1-8cde-8a333e24f564/openstackclient/0.log" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.874758 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovsdb-server-init/0.log" Feb 17 17:48:41 crc kubenswrapper[4694]: I0217 17:48:41.942175 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-xtmfm_e1c89368-7041-4c84-8ba6-624d0f0b695e/openstack-network-exporter/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.190246 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovsdb-server/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.239414 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovsdb-server-init/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.291869 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-q88vj_adef318b-03c0-4281-8b77-30b76a8904e6/ovs-vswitchd/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.465831 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-stczv_45514b0e-57f3-494a-823a-2a0f0c2f728d/ovn-controller/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.547487 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-rwf25_a7d24b5a-8b19-4532-a8ca-b34ebad591d1/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.755694 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10287785-ab79-4801-903b-0b4acdc8aca8/openstack-network-exporter/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.838099 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf651b7-0b06-4b95-916e-e7fe6630d272/openstack-network-exporter/0.log" Feb 17 17:48:42 crc kubenswrapper[4694]: I0217 17:48:42.916875 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_10287785-ab79-4801-903b-0b4acdc8aca8/ovn-northd/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.014949 4694 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_abf651b7-0b06-4b95-916e-e7fe6630d272/ovsdbserver-nb/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.260928 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_391b1870-6558-40d0-be12-d31b3a57ed32/openstack-network-exporter/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.274318 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_391b1870-6558-40d0-be12-d31b3a57ed32/ovsdbserver-sb/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.385045 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7569cc6b-bv5js_1a8ff002-04b9-4dfa-af27-36823f7918a8/placement-api/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.546337 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_181e3039-f77f-47e6-acef-e1dcd93d30f8/setup-container/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.643213 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7569cc6b-bv5js_1a8ff002-04b9-4dfa-af27-36823f7918a8/placement-log/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.739634 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_181e3039-f77f-47e6-acef-e1dcd93d30f8/setup-container/0.log" Feb 17 17:48:43 crc kubenswrapper[4694]: I0217 17:48:43.849993 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_181e3039-f77f-47e6-acef-e1dcd93d30f8/rabbitmq/0.log" Feb 17 17:48:44 crc kubenswrapper[4694]: I0217 17:48:44.230009 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:44 crc kubenswrapper[4694]: I0217 17:48:44.230767 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:44 crc kubenswrapper[4694]: I0217 17:48:44.443825 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9698ccc3-769b-43aa-a4bf-f7c95342555a/setup-container/0.log" Feb 17 17:48:44 crc kubenswrapper[4694]: I0217 17:48:44.596587 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9698ccc3-769b-43aa-a4bf-f7c95342555a/setup-container/0.log" Feb 17 17:48:44 crc kubenswrapper[4694]: I0217 17:48:44.727033 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9698ccc3-769b-43aa-a4bf-f7c95342555a/rabbitmq/0.log" Feb 17 17:48:44 crc kubenswrapper[4694]: I0217 17:48:44.871734 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l2g7s_fe2671a8-04cd-4b09-ba6a-e6250762985e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.050703 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-d487f_45da847c-7705-408d-a3c4-b05253e15d3f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.253556 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wfs8m_f7ab42be-d837-4b1d-8d80-164f92fc205a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.291914 4694 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r66bf" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="registry-server" probeResult="failure" output=< Feb 17 17:48:45 crc kubenswrapper[4694]: timeout: failed to connect service ":50051" within 1s Feb 17 17:48:45 crc kubenswrapper[4694]: > Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 
17:48:45.321701 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hz7mj_ef784368-4cf0-42fb-b4c5-b5ca19fe472a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.621058 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-vsfcq_4f46d450-c4d9-4b9e-bcb3-5e3ea915f59a/ssh-known-hosts-edpm-deployment/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.745995 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f54df747-vdnkk_c0f37d92-d923-43f2-807f-d52cd9003a2c/proxy-httpd/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.803475 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9f54df747-vdnkk_c0f37d92-d923-43f2-807f-d52cd9003a2c/proxy-server/0.log" Feb 17 17:48:45 crc kubenswrapper[4694]: I0217 17:48:45.912902 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-q6jc9_78214f4f-59b9-4ed5-bd7c-8fb4560c6ef3/swift-ring-rebalance/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.035917 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-auditor/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.181398 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-reaper/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.217556 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-replicator/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.296710 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/account-server/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.405697 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-replicator/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.465484 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-auditor/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.470789 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-server/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.559212 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/container-updater/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.653612 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-auditor/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.673727 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-expirer/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.734134 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-replicator/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.784383 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-server/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.891064 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/object-updater/0.log" Feb 17 17:48:46 crc kubenswrapper[4694]: I0217 17:48:46.936641 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/rsync/0.log" Feb 17 17:48:47 crc kubenswrapper[4694]: I0217 17:48:47.044339 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_852493c7-97b4-4850-9ef3-44ec598d9d1a/swift-recon-cron/0.log" Feb 17 17:48:47 crc kubenswrapper[4694]: I0217 17:48:47.205858 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7jtpj_fe57a3c1-260f-4f46-b977-82656c0ad9d6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:47 crc kubenswrapper[4694]: I0217 17:48:47.326937 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5a4a02dc-9cc2-4445-9624-359734b69ae6/tempest-tests-tempest-tests-runner/0.log" Feb 17 17:48:47 crc kubenswrapper[4694]: I0217 17:48:47.473062 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6cf976c8-b5ae-4b82-ad19-5d28b6196b80/test-operator-logs-container/0.log" Feb 17 17:48:47 crc kubenswrapper[4694]: I0217 17:48:47.587729 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2nmdv_bfbc588a-92ae-49ee-bcad-433dc28ecad7/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 17:48:52 crc kubenswrapper[4694]: I0217 17:48:52.996057 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_11f9dba1-ca35-4d40-b07b-44a141b8a80b/memcached/0.log" Feb 17 17:48:54 crc kubenswrapper[4694]: I0217 17:48:54.276835 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 
17:48:54 crc kubenswrapper[4694]: I0217 17:48:54.320094 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:54 crc kubenswrapper[4694]: I0217 17:48:54.512331 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r66bf"] Feb 17 17:48:55 crc kubenswrapper[4694]: I0217 17:48:55.482228 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r66bf" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="registry-server" containerID="cri-o://7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b" gracePeriod=2 Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.091872 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.157849 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztd4k\" (UniqueName: \"kubernetes.io/projected/e36f83bb-fee1-4584-ae45-a4b6345b7c41-kube-api-access-ztd4k\") pod \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.157913 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-catalog-content\") pod \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.158027 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-utilities\") pod \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\" (UID: \"e36f83bb-fee1-4584-ae45-a4b6345b7c41\") " Feb 17 17:48:56 
crc kubenswrapper[4694]: I0217 17:48:56.159169 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-utilities" (OuterVolumeSpecName: "utilities") pod "e36f83bb-fee1-4584-ae45-a4b6345b7c41" (UID: "e36f83bb-fee1-4584-ae45-a4b6345b7c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.164711 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36f83bb-fee1-4584-ae45-a4b6345b7c41-kube-api-access-ztd4k" (OuterVolumeSpecName: "kube-api-access-ztd4k") pod "e36f83bb-fee1-4584-ae45-a4b6345b7c41" (UID: "e36f83bb-fee1-4584-ae45-a4b6345b7c41"). InnerVolumeSpecName "kube-api-access-ztd4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.260022 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.260053 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztd4k\" (UniqueName: \"kubernetes.io/projected/e36f83bb-fee1-4584-ae45-a4b6345b7c41-kube-api-access-ztd4k\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.284362 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e36f83bb-fee1-4584-ae45-a4b6345b7c41" (UID: "e36f83bb-fee1-4584-ae45-a4b6345b7c41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.362019 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e36f83bb-fee1-4584-ae45-a4b6345b7c41-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.493770 4694 generic.go:334] "Generic (PLEG): container finished" podID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerID="7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b" exitCode=0 Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.493818 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerDied","Data":"7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b"} Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.493851 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r66bf" event={"ID":"e36f83bb-fee1-4584-ae45-a4b6345b7c41","Type":"ContainerDied","Data":"10877a161dcc05b420d7ad801d56570f22cb4fed01aec4823c1165518e26d42e"} Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.493874 4694 scope.go:117] "RemoveContainer" containerID="7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.494027 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r66bf" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.529432 4694 scope.go:117] "RemoveContainer" containerID="5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.537296 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r66bf"] Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.547224 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r66bf"] Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.554800 4694 scope.go:117] "RemoveContainer" containerID="b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.599286 4694 scope.go:117] "RemoveContainer" containerID="7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b" Feb 17 17:48:56 crc kubenswrapper[4694]: E0217 17:48:56.600119 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b\": container with ID starting with 7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b not found: ID does not exist" containerID="7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.600171 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b"} err="failed to get container status \"7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b\": rpc error: code = NotFound desc = could not find container \"7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b\": container with ID starting with 7e141cd96542ce6eb66c971303dbbddcce11756e9fdbfa9f3ea99d177d1c972b not found: ID does 
not exist" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.600199 4694 scope.go:117] "RemoveContainer" containerID="5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923" Feb 17 17:48:56 crc kubenswrapper[4694]: E0217 17:48:56.600552 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923\": container with ID starting with 5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923 not found: ID does not exist" containerID="5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.600574 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923"} err="failed to get container status \"5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923\": rpc error: code = NotFound desc = could not find container \"5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923\": container with ID starting with 5350630971d445ac48e5bce68db5a1ee4810d891067d9aabe44225c323b2e923 not found: ID does not exist" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.600586 4694 scope.go:117] "RemoveContainer" containerID="b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c" Feb 17 17:48:56 crc kubenswrapper[4694]: E0217 17:48:56.600850 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c\": container with ID starting with b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c not found: ID does not exist" containerID="b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.600894 4694 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c"} err="failed to get container status \"b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c\": rpc error: code = NotFound desc = could not find container \"b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c\": container with ID starting with b8ea9bc8ffc1a9d538610a87035bf86d71edc5064a8b0d1c2eb373e9e9cedd2c not found: ID does not exist" Feb 17 17:48:56 crc kubenswrapper[4694]: I0217 17:48:56.905955 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" path="/var/lib/kubelet/pods/e36f83bb-fee1-4584-ae45-a4b6345b7c41/volumes" Feb 17 17:49:13 crc kubenswrapper[4694]: I0217 17:49:13.958329 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/util/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.129042 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/pull/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.138140 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/util/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.249554 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/pull/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.397041 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/pull/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.470232 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/extract/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.487276 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c9235dd58de8d3cfbcd01f85777c38f5251b6fd6e3e8375e68ea86da5dlqqrf_1ee81c0e-d67b-4713-bdb1-62d8092358ec/util/0.log" Feb 17 17:49:14 crc kubenswrapper[4694]: I0217 17:49:14.869657 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-dkblx_6df3483a-cb85-4da5-a314-e0aea8874af8/manager/0.log" Feb 17 17:49:15 crc kubenswrapper[4694]: I0217 17:49:15.200017 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68c6d499cb-pkrtr_4f4cd3e2-6744-4dd9-8b67-f934bcd3cfcd/manager/0.log" Feb 17 17:49:15 crc kubenswrapper[4694]: I0217 17:49:15.410816 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-cc8kn_17fc1c20-dd01-4529-99a2-5a758dd7d8f1/manager/0.log" Feb 17 17:49:15 crc kubenswrapper[4694]: I0217 17:49:15.641150 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-5s57w_9690ac41-b444-4d39-adfb-de4dd1b4d581/manager/0.log" Feb 17 17:49:16 crc kubenswrapper[4694]: I0217 17:49:16.109573 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-5zb28_73478819-b2d9-484f-8f31-12636ce0fff1/manager/0.log" Feb 17 17:49:16 crc kubenswrapper[4694]: I0217 17:49:16.295827 4694 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-f8krc_37924017-8b0a-4920-becb-89e528139e25/manager/0.log" Feb 17 17:49:16 crc kubenswrapper[4694]: I0217 17:49:16.615854 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-wj245_42a9eede-5ff2-40da-9491-44a8012320a2/manager/0.log" Feb 17 17:49:16 crc kubenswrapper[4694]: I0217 17:49:16.771183 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-96fff9cb8-4qw8g_df099335-bb95-4e69-9628-9f53a170b043/manager/0.log" Feb 17 17:49:16 crc kubenswrapper[4694]: I0217 17:49:16.901597 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-57746b5ff9-fk2xd_b43c51f9-625e-4499-bdcc-612213a353df/manager/0.log" Feb 17 17:49:16 crc kubenswrapper[4694]: I0217 17:49:16.968522 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66997756f6-z7ldj_80223b5f-ef20-4fc6-b4bc-b8d63046db41/manager/0.log" Feb 17 17:49:17 crc kubenswrapper[4694]: I0217 17:49:17.197810 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-kqwxc_4ab6ce1a-7a26-4b71-be84-b27da7acf5c4/manager/0.log" Feb 17 17:49:17 crc kubenswrapper[4694]: I0217 17:49:17.390509 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5ddd85db87-8zxvf_c97c1c8a-6504-4db2-ad45-0a0c2f84551f/manager/0.log" Feb 17 17:49:17 crc kubenswrapper[4694]: I0217 17:49:17.636213 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-ttbkz_c9a788f4-d0b6-4275-9b3c-33f39fe70178/manager/0.log" Feb 17 17:49:18 crc kubenswrapper[4694]: I0217 
17:49:18.365364 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5f8bcb546f-d72cj_8096d5be-2884-4a45-839b-1b2b20bc116d/operator/0.log" Feb 17 17:49:18 crc kubenswrapper[4694]: I0217 17:49:18.593865 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hlsbt_6022b0aa-9d87-47ae-8e99-4f71ef252803/registry-server/0.log" Feb 17 17:49:18 crc kubenswrapper[4694]: I0217 17:49:18.847812 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-pj5f2_db08b317-d51b-471e-a235-2c9b4cd1f6f7/manager/0.log" Feb 17 17:49:19 crc kubenswrapper[4694]: I0217 17:49:19.060481 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-xgzng_3629dece-1e4c-40cf-bb56-d15d6ca8aa44/manager/0.log" Feb 17 17:49:19 crc kubenswrapper[4694]: I0217 17:49:19.244898 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ggsvh_a65ffd3d-87f5-4491-b71e-9823c314bc1f/operator/0.log" Feb 17 17:49:19 crc kubenswrapper[4694]: I0217 17:49:19.888897 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-745bbbd77b-m2ntk_26d16927-5afd-4da9-a66a-c20006f1d9e7/manager/0.log" Feb 17 17:49:19 crc kubenswrapper[4694]: I0217 17:49:19.971838 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-xgh9m_7eee970c-7e73-4420-adc3-331ee21c914c/manager/0.log" Feb 17 17:49:20 crc kubenswrapper[4694]: I0217 17:49:20.162619 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-7285z_97487951-8888-4a3b-91fc-76d324fdf255/manager/0.log" Feb 17 17:49:20 crc kubenswrapper[4694]: I0217 
17:49:20.167917 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dbd849dbc-2qx8n_607606da-e993-49b1-98ff-a2e0c2146f8a/manager/0.log" Feb 17 17:49:20 crc kubenswrapper[4694]: I0217 17:49:20.198800 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-w98pl_ad0948c6-64ee-47a8-b5da-aaa3b8f051ef/manager/0.log" Feb 17 17:49:20 crc kubenswrapper[4694]: I0217 17:49:20.328672 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c469bc6bb-ch6kt_1e4d1433-71e3-4cc3-8873-1fa5cd78e961/manager/0.log" Feb 17 17:49:24 crc kubenswrapper[4694]: I0217 17:49:24.175736 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-xc6sk_77c2abe7-0f53-4b11-932c-1f767a6d21b2/manager/0.log" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.744254 4694 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jkd9f"] Feb 17 17:49:40 crc kubenswrapper[4694]: E0217 17:49:40.745167 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="extract-content" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.745181 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="extract-content" Feb 17 17:49:40 crc kubenswrapper[4694]: E0217 17:49:40.745202 4694 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="extract-utilities" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.745209 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="extract-utilities" Feb 17 17:49:40 crc kubenswrapper[4694]: E0217 17:49:40.745221 4694 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="registry-server" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.745229 4694 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="registry-server" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.745407 4694 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36f83bb-fee1-4584-ae45-a4b6345b7c41" containerName="registry-server" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.746621 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.758246 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jkd9f"] Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.788743 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-utilities\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.788792 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-catalog-content\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.788840 4694 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qkc\" (UniqueName: 
\"kubernetes.io/projected/9719362f-3117-423a-8cb6-b1f0e8c8ced1-kube-api-access-z7qkc\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.890504 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-utilities\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.890581 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-catalog-content\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.890659 4694 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qkc\" (UniqueName: \"kubernetes.io/projected/9719362f-3117-423a-8cb6-b1f0e8c8ced1-kube-api-access-z7qkc\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.891672 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-utilities\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.891955 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-catalog-content\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:40 crc kubenswrapper[4694]: I0217 17:49:40.923567 4694 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qkc\" (UniqueName: \"kubernetes.io/projected/9719362f-3117-423a-8cb6-b1f0e8c8ced1-kube-api-access-z7qkc\") pod \"certified-operators-jkd9f\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:41 crc kubenswrapper[4694]: I0217 17:49:41.064577 4694 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:41 crc kubenswrapper[4694]: I0217 17:49:41.554747 4694 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jkd9f"] Feb 17 17:49:41 crc kubenswrapper[4694]: I0217 17:49:41.885312 4694 generic.go:334] "Generic (PLEG): container finished" podID="9719362f-3117-423a-8cb6-b1f0e8c8ced1" containerID="2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc" exitCode=0 Feb 17 17:49:41 crc kubenswrapper[4694]: I0217 17:49:41.885573 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerDied","Data":"2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc"} Feb 17 17:49:41 crc kubenswrapper[4694]: I0217 17:49:41.885600 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerStarted","Data":"8114124306a4dcf41000f1518bd1c4bc1b24ea1e70a62c72afac7083d5a8a104"} Feb 17 17:49:42 crc kubenswrapper[4694]: I0217 17:49:42.200044 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zjbh5_cbc2bd3c-fb82-4835-9103-f7bf30e51f17/control-plane-machine-set-operator/0.log" Feb 17 17:49:42 crc kubenswrapper[4694]: I0217 17:49:42.364242 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t44rx_5733b257-6fe2-4df1-aa83-4eaf3a84fdcc/kube-rbac-proxy/0.log" Feb 17 17:49:42 crc kubenswrapper[4694]: I0217 17:49:42.411439 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t44rx_5733b257-6fe2-4df1-aa83-4eaf3a84fdcc/machine-api-operator/0.log" Feb 17 17:49:42 crc kubenswrapper[4694]: I0217 17:49:42.893849 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerStarted","Data":"a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645"} Feb 17 17:49:43 crc kubenswrapper[4694]: I0217 17:49:43.905477 4694 generic.go:334] "Generic (PLEG): container finished" podID="9719362f-3117-423a-8cb6-b1f0e8c8ced1" containerID="a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645" exitCode=0 Feb 17 17:49:43 crc kubenswrapper[4694]: I0217 17:49:43.905595 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerDied","Data":"a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645"} Feb 17 17:49:44 crc kubenswrapper[4694]: I0217 17:49:44.916495 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerStarted","Data":"91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11"} Feb 17 17:49:44 crc kubenswrapper[4694]: I0217 17:49:44.943469 4694 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jkd9f" podStartSLOduration=2.5035815059999997 podStartE2EDuration="4.943450031s" podCreationTimestamp="2026-02-17 17:49:40 +0000 UTC" firstStartedPulling="2026-02-17 17:49:41.887291344 +0000 UTC m=+4049.644366668" lastFinishedPulling="2026-02-17 17:49:44.327159869 +0000 UTC m=+4052.084235193" observedRunningTime="2026-02-17 17:49:44.937860731 +0000 UTC m=+4052.694936055" watchObservedRunningTime="2026-02-17 17:49:44.943450031 +0000 UTC m=+4052.700525345" Feb 17 17:49:51 crc kubenswrapper[4694]: I0217 17:49:51.064769 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:51 crc kubenswrapper[4694]: I0217 17:49:51.065519 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:51 crc kubenswrapper[4694]: I0217 17:49:51.199413 4694 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:52 crc kubenswrapper[4694]: I0217 17:49:52.037336 4694 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:52 crc kubenswrapper[4694]: I0217 17:49:52.099452 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jkd9f"] Feb 17 17:49:53 crc kubenswrapper[4694]: I0217 17:49:53.998070 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jkd9f" podUID="9719362f-3117-423a-8cb6-b1f0e8c8ced1" containerName="registry-server" containerID="cri-o://91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11" gracePeriod=2 Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.464604 4694 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.563252 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qkc\" (UniqueName: \"kubernetes.io/projected/9719362f-3117-423a-8cb6-b1f0e8c8ced1-kube-api-access-z7qkc\") pod \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.563420 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-utilities\") pod \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.563490 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-catalog-content\") pod \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\" (UID: \"9719362f-3117-423a-8cb6-b1f0e8c8ced1\") " Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.564200 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-utilities" (OuterVolumeSpecName: "utilities") pod "9719362f-3117-423a-8cb6-b1f0e8c8ced1" (UID: "9719362f-3117-423a-8cb6-b1f0e8c8ced1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.569395 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9719362f-3117-423a-8cb6-b1f0e8c8ced1-kube-api-access-z7qkc" (OuterVolumeSpecName: "kube-api-access-z7qkc") pod "9719362f-3117-423a-8cb6-b1f0e8c8ced1" (UID: "9719362f-3117-423a-8cb6-b1f0e8c8ced1"). InnerVolumeSpecName "kube-api-access-z7qkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.627787 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9719362f-3117-423a-8cb6-b1f0e8c8ced1" (UID: "9719362f-3117-423a-8cb6-b1f0e8c8ced1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.666552 4694 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.666602 4694 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9719362f-3117-423a-8cb6-b1f0e8c8ced1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 17:49:54 crc kubenswrapper[4694]: I0217 17:49:54.666647 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qkc\" (UniqueName: \"kubernetes.io/projected/9719362f-3117-423a-8cb6-b1f0e8c8ced1-kube-api-access-z7qkc\") on node \"crc\" DevicePath \"\"" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.015001 4694 generic.go:334] "Generic (PLEG): container finished" podID="9719362f-3117-423a-8cb6-b1f0e8c8ced1" containerID="91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11" exitCode=0 Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.015099 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerDied","Data":"91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11"} Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.015149 4694 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-jkd9f" event={"ID":"9719362f-3117-423a-8cb6-b1f0e8c8ced1","Type":"ContainerDied","Data":"8114124306a4dcf41000f1518bd1c4bc1b24ea1e70a62c72afac7083d5a8a104"} Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.015184 4694 scope.go:117] "RemoveContainer" containerID="91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.015199 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jkd9f" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.044167 4694 scope.go:117] "RemoveContainer" containerID="a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.047930 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jkd9f"] Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.057128 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jkd9f"] Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.066209 4694 scope.go:117] "RemoveContainer" containerID="2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.115629 4694 scope.go:117] "RemoveContainer" containerID="91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11" Feb 17 17:49:55 crc kubenswrapper[4694]: E0217 17:49:55.116136 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11\": container with ID starting with 91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11 not found: ID does not exist" containerID="91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 
17:49:55.116187 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11"} err="failed to get container status \"91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11\": rpc error: code = NotFound desc = could not find container \"91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11\": container with ID starting with 91eaa92f57d2a45052debbe12c611d36c436253dd83fa0f40f7f5da7b78fcd11 not found: ID does not exist" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.116252 4694 scope.go:117] "RemoveContainer" containerID="a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645" Feb 17 17:49:55 crc kubenswrapper[4694]: E0217 17:49:55.116731 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645\": container with ID starting with a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645 not found: ID does not exist" containerID="a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.116756 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645"} err="failed to get container status \"a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645\": rpc error: code = NotFound desc = could not find container \"a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645\": container with ID starting with a515811045c1c1efbde59438e2496cb0bdd825f5c6d750e488fc4ced68aa0645 not found: ID does not exist" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.116772 4694 scope.go:117] "RemoveContainer" containerID="2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc" Feb 17 17:49:55 crc 
kubenswrapper[4694]: E0217 17:49:55.117108 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc\": container with ID starting with 2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc not found: ID does not exist" containerID="2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc" Feb 17 17:49:55 crc kubenswrapper[4694]: I0217 17:49:55.117140 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc"} err="failed to get container status \"2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc\": rpc error: code = NotFound desc = could not find container \"2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc\": container with ID starting with 2e519b51ce30f08498e1f4b50a4c07e04e05af2dbb094d10d16ec82919fac7dc not found: ID does not exist" Feb 17 17:49:56 crc kubenswrapper[4694]: I0217 17:49:56.200366 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-k7pjr_a4ed5c6f-777c-4e48-acc5-335e03efbe15/cert-manager-controller/0.log" Feb 17 17:49:56 crc kubenswrapper[4694]: I0217 17:49:56.450311 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-45nhc_56fc387f-0f40-4de1-b4f5-628ecdecc25b/cert-manager-cainjector/0.log" Feb 17 17:49:56 crc kubenswrapper[4694]: I0217 17:49:56.513177 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-c29mn_da727533-6298-432b-9048-f78ff8faad8f/cert-manager-webhook/0.log" Feb 17 17:49:56 crc kubenswrapper[4694]: I0217 17:49:56.904745 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9719362f-3117-423a-8cb6-b1f0e8c8ced1" 
path="/var/lib/kubelet/pods/9719362f-3117-423a-8cb6-b1f0e8c8ced1/volumes" Feb 17 17:50:11 crc kubenswrapper[4694]: I0217 17:50:11.743242 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-6xx9x_9f3c6b99-3db1-447f-b31c-1692a70ec415/nmstate-console-plugin/0.log" Feb 17 17:50:11 crc kubenswrapper[4694]: I0217 17:50:11.810961 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-t7txz_13812f03-334f-44fa-9c5c-c5d257756b27/nmstate-handler/0.log" Feb 17 17:50:11 crc kubenswrapper[4694]: I0217 17:50:11.923955 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-jsvrg_e83663da-470f-4ebf-ac6b-64612e8724f4/kube-rbac-proxy/0.log" Feb 17 17:50:11 crc kubenswrapper[4694]: I0217 17:50:11.999884 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-jsvrg_e83663da-470f-4ebf-ac6b-64612e8724f4/nmstate-metrics/0.log" Feb 17 17:50:12 crc kubenswrapper[4694]: I0217 17:50:12.085187 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-m65zv_c5fe0950-b548-4cdc-9e9d-c2483a8213d9/nmstate-operator/0.log" Feb 17 17:50:12 crc kubenswrapper[4694]: I0217 17:50:12.183841 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-whp8f_aa8c8617-1bdc-461a-9aea-d534da85b5e4/nmstate-webhook/0.log" Feb 17 17:50:39 crc kubenswrapper[4694]: I0217 17:50:39.945263 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-nhxz7_13962e8e-e444-4010-912e-9c953c8f7b8f/kube-rbac-proxy/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.089500 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-nhxz7_13962e8e-e444-4010-912e-9c953c8f7b8f/controller/0.log" Feb 17 17:50:40 crc 
kubenswrapper[4694]: I0217 17:50:40.148663 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.331323 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.355320 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.361924 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.368309 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.525217 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.525475 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.527555 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.572430 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.795032 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/controller/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.823205 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-metrics/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.828148 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-frr-files/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.847884 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/cp-reloader/0.log" Feb 17 17:50:40 crc kubenswrapper[4694]: I0217 17:50:40.967710 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/frr-metrics/0.log" Feb 17 17:50:41 crc kubenswrapper[4694]: I0217 17:50:41.039075 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/kube-rbac-proxy/0.log" Feb 17 17:50:41 crc kubenswrapper[4694]: I0217 17:50:41.039216 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/kube-rbac-proxy-frr/0.log" Feb 17 17:50:41 crc kubenswrapper[4694]: I0217 17:50:41.149464 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/reloader/0.log" Feb 17 17:50:41 crc kubenswrapper[4694]: I0217 17:50:41.273554 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-nzwnk_b1d3f052-b387-452c-a154-a4f7cd14a6b7/frr-k8s-webhook-server/0.log" Feb 17 17:50:41 crc kubenswrapper[4694]: I0217 17:50:41.815523 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d47c4d78b-gffhs_152f177f-c542-4114-9d7b-601185b129b2/manager/0.log" Feb 17 17:50:41 crc kubenswrapper[4694]: I0217 17:50:41.903759 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7756f55684-8twln_d2d8d1c1-ad28-45c6-8314-935e8c60b976/webhook-server/0.log" Feb 17 17:50:42 crc kubenswrapper[4694]: I0217 17:50:42.200850 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xlw27_e542a8da-05af-4ddd-95dc-cf10576c4658/kube-rbac-proxy/0.log" Feb 17 17:50:42 crc kubenswrapper[4694]: I0217 17:50:42.299158 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cmzh5_e61de6d5-a385-4bbb-8ddf-b4af95e92b3c/frr/0.log" Feb 17 17:50:42 crc kubenswrapper[4694]: I0217 17:50:42.500515 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xlw27_e542a8da-05af-4ddd-95dc-cf10576c4658/speaker/0.log" Feb 17 17:50:44 crc kubenswrapper[4694]: I0217 17:50:44.618454 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:50:44 crc kubenswrapper[4694]: I0217 17:50:44.618848 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:50:55 crc kubenswrapper[4694]: I0217 17:50:55.850284 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/util/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.069856 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/pull/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.086803 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/util/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.138061 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/pull/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.321587 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/util/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.330927 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/pull/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.361020 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2137pqkv_9d0c9969-a494-496f-bc4d-721e0a4ac013/extract/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.486862 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-utilities/0.log" Feb 17 17:50:56 crc 
kubenswrapper[4694]: I0217 17:50:56.655850 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-content/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.679775 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-utilities/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.682088 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-content/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.831518 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-utilities/0.log" Feb 17 17:50:56 crc kubenswrapper[4694]: I0217 17:50:56.842435 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/extract-content/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.146419 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-utilities/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.261037 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-utilities/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.304409 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-content/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.365416 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-content/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.413788 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hznv8_737fdc2f-4b41-4c22-bae3-2411c82d16af/registry-server/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.473547 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-utilities/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.508320 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/extract-content/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.675459 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/util/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.886874 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/util/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.928397 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/pull/0.log" Feb 17 17:50:57 crc kubenswrapper[4694]: I0217 17:50:57.963192 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/pull/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.109475 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/util/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.135509 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/extract/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.175134 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecarz6ct_664fb1f1-f1c6-483a-894d-8bf5ab3ec09f/pull/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.190276 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pb62z_35ca4ecd-4b3f-45f9-b620-a945b332d711/registry-server/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.324892 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w82n9_80752637-17b7-451f-a4f9-c15ff9d5bd47/marketplace-operator/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.386150 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-utilities/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.560160 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-content/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.571535 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-utilities/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.601353 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-content/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.749735 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-utilities/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.759926 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/extract-content/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.894408 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8g5wb_64364749-c028-459b-8099-dd62cae9a8a1/registry-server/0.log" Feb 17 17:50:58 crc kubenswrapper[4694]: I0217 17:50:58.945692 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-utilities/0.log" Feb 17 17:50:59 crc kubenswrapper[4694]: I0217 17:50:59.109547 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-content/0.log" Feb 17 17:50:59 crc kubenswrapper[4694]: I0217 17:50:59.112877 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-utilities/0.log" Feb 17 17:50:59 crc kubenswrapper[4694]: I0217 17:50:59.155296 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-content/0.log" Feb 17 17:50:59 crc kubenswrapper[4694]: I0217 17:50:59.263272 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-utilities/0.log" 
Feb 17 17:50:59 crc kubenswrapper[4694]: I0217 17:50:59.361775 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/extract-content/0.log" Feb 17 17:50:59 crc kubenswrapper[4694]: I0217 17:50:59.791161 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxhhr_0bc3455e-a064-46d4-9504-b6347f5508d5/registry-server/0.log" Feb 17 17:51:14 crc kubenswrapper[4694]: I0217 17:51:14.618209 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:51:14 crc kubenswrapper[4694]: I0217 17:51:14.618796 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.618704 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.619398 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:51:44 crc kubenswrapper[4694]: 
I0217 17:51:44.619489 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.620699 4694 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b5174c4ff864458d1d7894579d126fb8403a1564a80db04be68f3c393a19a1b"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.620825 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://5b5174c4ff864458d1d7894579d126fb8403a1564a80db04be68f3c393a19a1b" gracePeriod=600 Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.993527 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="5b5174c4ff864458d1d7894579d126fb8403a1564a80db04be68f3c393a19a1b" exitCode=0 Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.993804 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"5b5174c4ff864458d1d7894579d126fb8403a1564a80db04be68f3c393a19a1b"} Feb 17 17:51:44 crc kubenswrapper[4694]: I0217 17:51:44.993839 4694 scope.go:117] "RemoveContainer" containerID="d0c7c917d321b77e6aafd99f190d345d0b226d703f204b0e0a3914ce5e0f603e" Feb 17 17:51:46 crc kubenswrapper[4694]: I0217 17:51:46.005718 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerStarted","Data":"eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac"} Feb 17 17:52:46 crc kubenswrapper[4694]: I0217 17:52:46.621535 4694 generic.go:334] "Generic (PLEG): container finished" podID="7a9b22ef-bb0b-4584-96b2-d4c8178c5030" containerID="7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059" exitCode=0 Feb 17 17:52:46 crc kubenswrapper[4694]: I0217 17:52:46.621685 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" event={"ID":"7a9b22ef-bb0b-4584-96b2-d4c8178c5030","Type":"ContainerDied","Data":"7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059"} Feb 17 17:52:46 crc kubenswrapper[4694]: I0217 17:52:46.623803 4694 scope.go:117] "RemoveContainer" containerID="7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059" Feb 17 17:52:47 crc kubenswrapper[4694]: I0217 17:52:47.395491 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6qxn6_must-gather-bbwm4_7a9b22ef-bb0b-4584-96b2-d4c8178c5030/gather/0.log" Feb 17 17:52:57 crc kubenswrapper[4694]: I0217 17:52:57.947947 4694 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6qxn6/must-gather-bbwm4"] Feb 17 17:52:57 crc kubenswrapper[4694]: I0217 17:52:57.948918 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" podUID="7a9b22ef-bb0b-4584-96b2-d4c8178c5030" containerName="copy" containerID="cri-o://b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7" gracePeriod=2 Feb 17 17:52:57 crc kubenswrapper[4694]: I0217 17:52:57.958979 4694 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6qxn6/must-gather-bbwm4"] Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.442885 4694 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-6qxn6_must-gather-bbwm4_7a9b22ef-bb0b-4584-96b2-d4c8178c5030/copy/0.log" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.443667 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.590741 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-must-gather-output\") pod \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.591092 4694 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54m9v\" (UniqueName: \"kubernetes.io/projected/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-kube-api-access-54m9v\") pod \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\" (UID: \"7a9b22ef-bb0b-4584-96b2-d4c8178c5030\") " Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.596988 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-kube-api-access-54m9v" (OuterVolumeSpecName: "kube-api-access-54m9v") pod "7a9b22ef-bb0b-4584-96b2-d4c8178c5030" (UID: "7a9b22ef-bb0b-4584-96b2-d4c8178c5030"). InnerVolumeSpecName "kube-api-access-54m9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.693178 4694 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54m9v\" (UniqueName: \"kubernetes.io/projected/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-kube-api-access-54m9v\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.732496 4694 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6qxn6_must-gather-bbwm4_7a9b22ef-bb0b-4584-96b2-d4c8178c5030/copy/0.log" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.732976 4694 generic.go:334] "Generic (PLEG): container finished" podID="7a9b22ef-bb0b-4584-96b2-d4c8178c5030" containerID="b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7" exitCode=143 Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.733049 4694 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6qxn6/must-gather-bbwm4" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.733057 4694 scope.go:117] "RemoveContainer" containerID="b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.755317 4694 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7a9b22ef-bb0b-4584-96b2-d4c8178c5030" (UID: "7a9b22ef-bb0b-4584-96b2-d4c8178c5030"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.766690 4694 scope.go:117] "RemoveContainer" containerID="7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.794918 4694 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a9b22ef-bb0b-4584-96b2-d4c8178c5030-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.813110 4694 scope.go:117] "RemoveContainer" containerID="b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7" Feb 17 17:52:58 crc kubenswrapper[4694]: E0217 17:52:58.813564 4694 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7\": container with ID starting with b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7 not found: ID does not exist" containerID="b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.813649 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7"} err="failed to get container status \"b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7\": rpc error: code = NotFound desc = could not find container \"b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7\": container with ID starting with b67d74d918808f21f0677033349999b874106d8eb22ee42a35ebbe6ddb9553c7 not found: ID does not exist" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.813684 4694 scope.go:117] "RemoveContainer" containerID="7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059" Feb 17 17:52:58 crc kubenswrapper[4694]: E0217 17:52:58.816543 4694 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059\": container with ID starting with 7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059 not found: ID does not exist" containerID="7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.816591 4694 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059"} err="failed to get container status \"7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059\": rpc error: code = NotFound desc = could not find container \"7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059\": container with ID starting with 7a31da59cc269d077f4ee8f18d111b586df6fd4c6ad3c1b51d8ad2c5cd99c059 not found: ID does not exist" Feb 17 17:52:58 crc kubenswrapper[4694]: I0217 17:52:58.905489 4694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9b22ef-bb0b-4584-96b2-d4c8178c5030" path="/var/lib/kubelet/pods/7a9b22ef-bb0b-4584-96b2-d4c8178c5030/volumes" Feb 17 17:53:43 crc kubenswrapper[4694]: I0217 17:53:43.946487 4694 scope.go:117] "RemoveContainer" containerID="9c61d4e84d11b236257e33c3f0d447c4b24496cb630cfb89e4ba9ef5f4be7ce6" Feb 17 17:54:14 crc kubenswrapper[4694]: I0217 17:54:14.617925 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:54:14 crc kubenswrapper[4694]: I0217 17:54:14.618574 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" 
podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:54:44 crc kubenswrapper[4694]: I0217 17:54:44.618584 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:54:44 crc kubenswrapper[4694]: I0217 17:54:44.619531 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:55:14 crc kubenswrapper[4694]: I0217 17:55:14.617565 4694 patch_prober.go:28] interesting pod/machine-config-daemon-b5hgc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 17:55:14 crc kubenswrapper[4694]: I0217 17:55:14.618431 4694 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 17:55:14 crc kubenswrapper[4694]: I0217 17:55:14.618512 4694 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" Feb 17 17:55:14 crc kubenswrapper[4694]: I0217 17:55:14.619841 4694 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac"} pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 17:55:14 crc kubenswrapper[4694]: I0217 17:55:14.619964 4694 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerName="machine-config-daemon" containerID="cri-o://eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac" gracePeriod=600 Feb 17 17:55:14 crc kubenswrapper[4694]: E0217 17:55:14.752304 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:55:15 crc kubenswrapper[4694]: I0217 17:55:15.060187 4694 generic.go:334] "Generic (PLEG): container finished" podID="05e7e385-beb4-4e06-8718-fd68e90ba74e" containerID="eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac" exitCode=0 Feb 17 17:55:15 crc kubenswrapper[4694]: I0217 17:55:15.060262 4694 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" event={"ID":"05e7e385-beb4-4e06-8718-fd68e90ba74e","Type":"ContainerDied","Data":"eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac"} Feb 17 17:55:15 crc kubenswrapper[4694]: I0217 17:55:15.060339 4694 scope.go:117] "RemoveContainer" 
containerID="5b5174c4ff864458d1d7894579d126fb8403a1564a80db04be68f3c393a19a1b" Feb 17 17:55:15 crc kubenswrapper[4694]: I0217 17:55:15.061863 4694 scope.go:117] "RemoveContainer" containerID="eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac" Feb 17 17:55:15 crc kubenswrapper[4694]: E0217 17:55:15.062353 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:55:25 crc kubenswrapper[4694]: I0217 17:55:25.897375 4694 scope.go:117] "RemoveContainer" containerID="eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac" Feb 17 17:55:25 crc kubenswrapper[4694]: E0217 17:55:25.898379 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:55:37 crc kubenswrapper[4694]: I0217 17:55:37.895299 4694 scope.go:117] "RemoveContainer" containerID="eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac" Feb 17 17:55:37 crc kubenswrapper[4694]: E0217 17:55:37.896197 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e" Feb 17 17:55:51 crc kubenswrapper[4694]: I0217 17:55:51.895210 4694 scope.go:117] "RemoveContainer" containerID="eb79e29ed3c0951c20bbe3ce69237d67d778d176c7742738b4d9e5796f87b4ac" Feb 17 17:55:51 crc kubenswrapper[4694]: E0217 17:55:51.896154 4694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b5hgc_openshift-machine-config-operator(05e7e385-beb4-4e06-8718-fd68e90ba74e)\"" pod="openshift-machine-config-operator/machine-config-daemon-b5hgc" podUID="05e7e385-beb4-4e06-8718-fd68e90ba74e"